This article provides a list of best practices and frequently asked questions (FAQ) for CSV uploads.
If you are looking for an article that describes how to create a CSV and what data to include, find it here.
The following is a list of best practices for CSVs:
- Totango supports non-English characters in the data set; however, file names must contain English characters only.
- Name each column header with the API name of the attribute the data should populate.
- Dates should be in ISO 8601 format.
- Numbers should be in US decimal format (no special characters such as $ or ,).
- If you use Excel, be aware that it often alters the formatting of a file when you open and save it. Once you generate the CSV, do not open and re-save it.
- Files should be saved in US CSV format with UTF-8 encoding. (If you are in a non-North American locale where CSV files are generated with pipe or semicolon separators, either change the regional settings in your OS or manually edit the file in a text editor so it uses comma separators and quote encapsulation.)
- All column headers should be populated
- The first row cannot be blank
- All column headers should have a unique name (no duplication)
- Negative values are accepted for attributes, but not for usage data counts.
- There is no file size limitation for recurring uploads
- Due to browser limitations, a file uploaded from your local computer cannot exceed 100MB
- A CSV file cannot contain more than 1,000 columns
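The formatting rules above can be sketched in Python. This is a minimal example, not an official tool; the headers and values here (accountID, contract_value, renewal_date) are illustrative placeholders, not actual Totango attribute names.

```python
import csv
from datetime import datetime, timezone

# Illustrative rows; replace the headers with the API names of your
# actual Totango attributes (these are assumed example names).
rows = [
    {"accountID": "ACME-001", "contract_value": "12500.50",
     "renewal_date": datetime(2021, 1, 4, tzinfo=timezone.utc)},
]

headers = ["accountID", "contract_value", "renewal_date"]

# utf-8 satisfies the encoding requirement; Python's csv module
# defaults to comma separators and quote encapsulation, and
# newline="" prevents extra blank lines on Windows.
with open("accounts.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=headers)
    writer.writeheader()
    for row in rows:
        row = dict(row)
        # Dates in ISO 8601; numbers stay in US decimal format (no $ or ,).
        row["renewal_date"] = row["renewal_date"].strftime("%Y-%m-%dT%H:%M:%S.0Z")
        writer.writerow(row)
```

Generating the file programmatically like this also sidesteps the Excel open/save issue mentioned above, since the CSV is never opened in a spreadsheet.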
- Do I need to create one CSV file with all my columns?
You can have as many CSV files as you'd like. The common denominator in those files is determined by the type of data you are uploading. Here is a breakdown of the requirements for each type of data load:
Users: accountID, UserID
Collections: accountID, collectionID
- I uploaded data, but Totango shows the value as loaded on the following day.
Your instance has a timezone set on it. Data is tagged with a date based on the instance's date at the time of upload, not your local time.
- When loading dates for different timezones, how do I account for the date change?
Your instance of Totango has a timezone associated with it. All date uploads without a time/timezone stamp are automatically converted to your instance's timezone. It is recommended that all date values sent to Totango be standardized on a single timezone.
In other words, always convert and send dates in UTC (GMT) format, e.g. 2021-01-04T00:00:00.0Z, if you have global operations with different definitions of date.
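As a sketch, converting a local timestamp to the UTC format above might look like this in Python (the New York timezone and the specific datetime are example assumptions):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# Example: a date captured in New York local time (assumed source timezone).
local_dt = datetime(2021, 1, 3, 20, 0, tzinfo=ZoneInfo("America/New_York"))

# Convert to UTC before writing the CSV, so the date is unambiguous
# regardless of which instance timezone Totango applies.
utc_dt = local_dt.astimezone(ZoneInfo("UTC"))
print(utc_dt.strftime("%Y-%m-%dT%H:%M:%S.0Z"))  # 2021-01-04T01:00:00.0Z
```

Note that 8 PM on January 3 in New York is already January 4 in UTC, which is exactly the kind of date shift this recommendation avoids.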
- I deleted the CSV job that I uploaded, but the data is still in Totango. Why?
You should not delete jobs in Totango; they are a historical record of the data you have loaded and have no relationship to the actual data set. To delete data, if you are a community customer, you can find instructions here. All other customers should contact email@example.com and provide a URL to the public segment of the data to delete.
- How do you delete users or collections uploaded via CSV?
Regardless of how the data got into Totango, please contact firstname.lastname@example.org and provide a URL to the public segment of the data to delete.
- Why is my CSV in a status of Partial Success, Failed, or Skipped?
Click the line item and look for the error message toward the bottom; it will describe the issue. Partial Success generally indicates an error with the ID, such as an ID that was missing or duplicated within your file. Below is a list of data load types and their required unique IDs:
Users: accountID, UserID (combination of the two should be unique in the file)
Collections: accountID, collectionID (combination of the two should be unique in the file)
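A quick pre-upload check for the duplicate-ID case described above can be sketched in Python. The function and file name are hypothetical; pass the ID columns for your data load type (e.g. accountID and UserID for users).

```python
import csv
from collections import Counter

def find_duplicate_ids(path, id_columns):
    """Return ID-column combinations that appear more than once in the CSV."""
    with open(path, newline="", encoding="utf-8") as f:
        counts = Counter(
            tuple(row[col] for col in id_columns)
            for row in csv.DictReader(f)
        )
    return [combo for combo, n in counts.items() if n > 1]

# Example usage for a users file (illustrative file name):
# dupes = find_duplicate_ids("users.csv", ["accountID", "UserID"])
# if dupes:
#     print("Duplicate ID combinations:", dupes)
```

Running a check like this before uploading can catch the most common cause of a Partial Success status.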
- I uploaded data via a CSV and saw that the data came over fine, but today the data has been replaced. Why?
Totango is a data aggregator. If you have data coming from another system, such as a CRM (SFDC, MS Dynamics) or a data warehouse, Totango shows the value from the last loaded data source. So if you upload data that is already sourced automatically from another system, it will revert back when that source next loads.