

Smarter duplicate detection for CSV import.


diamondsaw

Community Member, 1 Post

24 April 2014 at 5:04pm

Hi all,

Let's assume I have an Employee DataObject. If I'm implementing a custom EmployeeCsvBulkLoader class, I can specify a $duplicateChecks array with a key/value pair for EmployeeID to ensure no duplicate staff members are added to the database (assuming EmployeeID is one of the columns in the CSV file).
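For reference, this is roughly what that looks like at the moment (class and field names are just illustrative):

class EmployeeCsvBulkLoader extends CsvBulkLoader {

    // Match each incoming CSV row to an existing Employee record on
    // EmployeeID, so re-importing the same file doesn't create duplicates.
    public $duplicateChecks = array(
        'EmployeeID' => 'EmployeeID',
    );
}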

This is all well and good, but clueless administrators may not know the next available EmployeeID to assign to each employee in the CSV file (and why should they?). So how can I implement my EmployeeCsvBulkLoader class to detect duplicate rows based on a combination of FirstName, LastName, and Office?
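The closest thing I can picture is the callback form of $duplicateChecks, something along these lines (a rough sketch only, assuming the callback gets handed the field value plus the full record array, and that FirstName, LastName and Office are all columns in the CSV):

class EmployeeCsvBulkLoader extends CsvBulkLoader {

    // Route the duplicate check through a custom callback rather than a
    // single-column lookup. The array key still names one CSV column.
    public $duplicateChecks = array(
        'FirstName' => array('callback' => 'findDuplicateEmployee'),
    );

    // Look up an existing Employee by the combination of FirstName,
    // LastName and Office from the current CSV row. Returning a record
    // makes the loader update it; returning nothing creates a new one.
    public function findDuplicateEmployee($firstName, $record) {
        if(empty($record['FirstName']) || empty($record['LastName']) || empty($record['Office'])) {
            return false;
        }

        return Employee::get()->filter(array(
            'FirstName' => $record['FirstName'],
            'LastName'  => $record['LastName'],
            'Office'    => $record['Office'],
        ))->first();
    }
}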

I feel that this would be a much more convenient way to import new employees. It seems like the inability to detect duplicate entries based on composite keys is a big oversight for CSV imports.

Thanks in advance.