Using the TDWebApi People Import API to sync users between your organization and TeamDynamix

Summary

TeamDynamix offers a user import service in our Web API application. Clients can use it to post an import file containing user information to TeamDynamix. The import files are picked up by a processor, and user records are updated or created as necessary for your organization.

For all versions of TeamDynamix on or after 10.3, refer to the TDAdmin > Users & Roles > Import Jobs > Help page for available field help and usage details. The Import Job Help page will provide details on all available columns, how to format column headers, acceptable values, required field status and more. Use the information in this page to either manually create your .XLSX import sheet or use a script/system to programmatically generate it.

If you need a legacy Excel template for versions of TeamDynamix before 10.3, please contact TeamDynamix Support.

Table of Contents

  1. Technical Details
  2. Intended Usage
  3. Requirements
  4. Processing Schedule
  5. Checking Job Statuses
  6. Updating Existing Records
  7. Creating New Records
  8. Duplicate Detection & Validation of Choice-Based Fields
  9. Not Mapping Values vs. Clearing Values
  10. Concerning Custom Attributes
  11. Import Sheet File Generation and Usage Help
  12. Common Errors
  13. Other Resources

Technical Details

General

This article corresponds with the TeamDynamix Web API services located at:

  • Production: https://yourTeamDynamixDomain/TDWebApi/api/people/import
  • Sandbox: https://yourTeamDynamixDomain/SBTDWebApi/api/people/import

For more information about the service itself, please do one of the following:

  • Check the documentation located in the page which referred you here
  • Check the documentation located at https://yourTeamDynamixDomain/TDWebApi/Home/section/People
    From the People API documentation page, click the link for the POST /api/people/import endpoint.

The endpoint currently accepts import files up to 4MB. This is simply the largest file the endpoint will accept, not a guarantee that all records in the file will be processed within the import timeframe; that depends entirely on the amount and type of data in the file.

Options

The following options are available as querystring arguments for the people import API endpoint:

  • AllowIsActiveChanges
    Data Type: Boolean
    Accepted Values: true or false.
    Example: AllowIsActiveChanges=true
    Whether or not to allow changes to user active status. This option only applies to updates of existing user records; if records have to be created, all created records will be set as active. Excluding this argument or leaving it blank equates to a false value for this option.
     
  • AllowSecurityRoleChanges
    Data Type: Boolean
    Accepted Values: true or false.
    Example: AllowSecurityRoleChanges=true
    Whether or not to allow changes to user security roles. This option only applies to updates of existing user records. Excluding this argument or leaving it blank equates to a false value for this option.
     
  • AllowApplicationChanges
    Data Type: Boolean
    Accepted Values: true or false.
    Example: AllowApplicationChanges=true
    Whether or not to allow changes to application access. This option only applies to updates of existing user records of type User. Excluding this argument or leaving it blank equates to a false value for this option.
     
  • NotifyEmailAddresses
    Data Type: String
    Example: NotifyEmailAddresses=user1@example.edu,user2@example.edu
    The email address(es) to notify after people imports are processed server-side. To specify multiple values, use a comma as the separator. Notifications are sent after processing regardless of whether the job succeeded. Please note that this setting does not notify the specified addresses when an upload is posted; it notifies them after the import is processed by TeamDynamix.

An example POST URL utilizing all of the options might look like the following:
POST https://yourTeamDynamixDomain/TDWebApi/api/people/import?AllowIsActiveChanges=true&AllowSecurityRoleChanges=true&AllowApplicationChanges=true&NotifyEmailAddresses=user1@example.edu,user2@example.edu

Content Type for Posting Import File

When posting to the api/people/import endpoint, the post needs to be a multipart/form-data post where the request body simply contains the file being uploaded as a form part. For example, here are the headers and request body you might see if you tried this with a request composer tool like Fiddler.

Note that the code syntax in these examples is specific to Fiddler and will not work as-is in any scripting languages!

Fiddler Headers

Content-Type: multipart/form-data; boundary=-------------------------acebdf13572468
User-Agent: Fiddler


Fiddler Request Body

---------------------------acebdf13572468
Content-Disposition: form-data; name="fieldNameHere"; filename="PeopleImportTemplate.xlsx"
Content-Type: application/vnd.openxmlformats-officedocument.spreadsheetml.sheet

<@INCLUDE *[full path to where the file is on the machine making the call to the API]\PeopleImportTemplate.xlsx*@>
---------------------------acebdf13572468--
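
For scripted uploads, here is a minimal sketch using Python's requests library, which builds the multipart/form-data body and boundary automatically. It assumes you have already obtained a bearer token from the TDWebApi auth endpoints; the token, domain, form part name and file path below are placeholders.

import requests

BASE_URL = "https://yourTeamDynamixDomain/TDWebApi"  # use /SBTDWebApi for sandbox
token = "YOUR_BEARER_TOKEN"  # placeholder; obtain via the TDWebApi auth endpoints

# Optional querystring arguments described above
params = {
    "AllowIsActiveChanges": "true",
    "AllowSecurityRoleChanges": "true",
    "AllowApplicationChanges": "true",
    "NotifyEmailAddresses": "user1@example.edu,user2@example.edu",
}

with open("PeopleImportTemplate.xlsx", "rb") as f:
    response = requests.post(
        BASE_URL + "/api/people/import",
        params=params,
        headers={"Authorization": "Bearer " + token},
        # requests generates the multipart body; the part name is arbitrary
        files={"fieldNameHere": (
            "PeopleImportTemplate.xlsx",
            f,
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
        )},
        timeout=120,
    )

response.raise_for_status()  # raises if the upload was rejected
print(response.status_code, response.text)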

 

Intended Usage

This process can be used to:

  • Update existing customer and user records
  • Create new customer records if no existing customer is found and the user type is Customer
  • Create new users if no existing user is found and the user type is User
  • Set / Change application access for user records
  • Activate / Deactivate user records
  • Set / Change security roles for user records

This process will not:

  • Convert existing Customer records to User records or vice-versa
  • Change Username, Authentication Username, Authentication Provider, Password, User Type or any other field referenced in the Admin Import Job Help page (or, for versions before 10.3, the template file from this KB) as "for creation only" when existing users are updated.

Requirements

  1. The file must be in .XLSX format.
  2. The file must have a header row with supported TeamDynamix fields as column names (a minimal generation sketch follows below).
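
As one example of meeting these requirements, here is a minimal sketch generating an import sheet with the openpyxl library. The column names shown are illustrative only; use the exact headers from your Import Job Help page (or legacy template).

from openpyxl import Workbook

wb = Workbook()
ws = wb.active

# Header row: supported TeamDynamix fields as column names (case-sensitive)
ws.append(["Username", "First Name", "Last Name", "Primary Email", "User Type"])

# One row per person to create or update
ws.append(["user.a@school.edu", "User", "A", "user.a@school.edu", "Customer"])

wb.save("PeopleImport.xlsx")  # a single-worksheet .XLSX file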

Processing Schedule

The data contained in the files posted to this service is not processed in real-time. All import jobs are run in a batch process overnight. To decrease overall processing time, import files may be processed in parallel; they are not guaranteed to be processed serially in the order they were submitted, and you may see two or three files being processed at the same time. For this reason, if you submit multiple files to be processed, the data in those files should be unique across all files (as well as unique within the same file). Otherwise, two files may race to create the same accounts and run into errors.

If a file does not finish in the processing windows described below, it will need to be resubmitted in smaller chunks so the importer can work through the data. Submitting appropriately sized files is therefore very important.
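
If you need to break a large file up programmatically, a rough sketch with openpyxl follows. The chunk size is an assumption to tune to your data volume, and because each record lands in exactly one output file, the chunks remain unique across files.

from openpyxl import Workbook, load_workbook

CHUNK_SIZE = 5000  # rows per output file; an assumption, not a TeamDynamix limit

def write_chunk(header, rows, path):
    wb = Workbook()
    ws = wb.active
    ws.append(header)
    for r in rows:
        ws.append(r)
    wb.save(path)

src = load_workbook("PeopleImport.xlsx", read_only=True).active
row_iter = src.iter_rows(values_only=True)
header = next(row_iter)  # repeat the header row in every chunk

chunk, part = [], 1
for row in row_iter:
    chunk.append(row)
    if len(chunk) == CHUNK_SIZE:
        write_chunk(header, chunk, "PeopleImport_part%d.xlsx" % part)
        chunk, part = [], part + 1
if chunk:  # write any remainder
    write_chunk(header, chunk, "PeopleImport_part%d.xlsx" % part)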

Keep in mind that the later a file is submitted in the processing window, the less time is available to process it. When a processing window's end time is reached:
  1. All files currently being processed will be cut off and marked as Timed Out.
  2. All files which are in the queue, but have not started processing yet, will be left in a Pending state, to be picked up in the next window.

For this reason, it is best to submit files as early as possible in the target processing window. Submitting them late risks files not being processed, or being cut off mid-import.

Finally, there is not an infinite number of processors. TeamDynamix does its best to scale the processing system as needed (depending on the number of clients/imports), but there is still a finite number of them. It is possible your file will not be processed, especially if you submit it late in the accepted windows. While it should be rare for your file to be left in a Pending state, it is possible. If this happens, contact TeamDynamix Support to see if there is a reason your file was never started.

Production

  • 3 AM ET - 7 AM ET
    The system will continuously accept and process imports submitted anytime between 3 AM ET and 7 AM ET, in addition to any queued items from before the processing window.

Sandbox

  • 3 AM ET - 7 AM ET
    The system will continuously accept and process imports submitted anytime between 3 AM ET and 7 AM ET, in addition to any queued items from before the processing window.
  • 9 AM ET
    This is an additional, one-time processing window for SaaS customers. All currently pending imports in the queue will be processed. As soon as the back-end processing system has completed all queued items it will stop and not process further imports until the next window.
  • 12 PM ET
    This is an additional, one-time processing window for SaaS customers. All currently pending imports in the queue will be processed. As soon as the back-end processing system has completed all queued items it will stop and not process further imports until the next window.
  • 3 PM ET
    This is an additional, one-time processing window for SaaS customers. All currently pending imports in the queue will be processed. As soon as the back-end processing system has completed all queued items it will stop and not process further imports until the next window.
  • 6 PM ET
    This is an additional, one-time processing window for SaaS customers. All currently pending imports in the queue will be processed. As soon as the back-end processing system has completed all queued items it will stop and not process further imports until the next window.

On-Prem (Installed) Clients

By default, on-premise customers only get the 3 AM - 7 AM local time processing windows configured on their production and sandbox setups. The schedules can be modified, added to or run manually on the on-premise application server(s).

TeamDynamix strongly recommends against running this processing during any time period where users may be in the software. User import processing is extremely resource-intensive. Running this during periods where users are in the system will make the applications unresponsive to end-users until processing has fully completed!

Checking Job Statuses

You may always check the status of import jobs in the TeamDynamix Admin application by navigating to Admin > Users & Roles > Import Jobs. The import jobs page lists all jobs and their current status. Completed jobs may be drilled into to check for errors (if applicable). Pending jobs may similarly be drilled into and canceled before they run.

Updating Existing Records

This process will attempt to match rows in the import file to existing users in the system. If existing records are found, they will be updated. If no existing records are found, new records will be created. The following matching logic is used to determine whether a record should be updated or created.

  1. Username - If there already exists one or more people, regardless of active status, with the same TeamDynamix username as a row that is being imported, those records will be updated from the spreadsheet. This field will only be used for the purposes of matching. TeamDynamix username fields will not be updated as part of this process.
  2. Authentication Username - If there already exists one or more people, regardless of active status, with the same TeamDynamix authentication username as a row that is being imported, those records will be updated from the spreadsheet. This is the Auth Username specified for the user in the TDAdmin application. This field will only be used for the purposes of matching. TeamDynamix authentication username fields will not be updated as part of this process.
  3. Organizational ID - If there already exists one or more people, regardless of active status, with the same organizational ID as a row that is being imported, those records will be updated from the spreadsheet.
  4. Primary Email Address - If there already exists one or more people, regardless of active status, with the same primary email as a row that is being imported, those records will be updated from the spreadsheet.

User organizational IDs and primary email addresses will be updated as part of this process as long as they are not the fields which records are matched on.
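
For illustration only, here is a rough Python sketch of that matching precedence. This is not TeamDynamix's actual implementation; the field names simply mirror the import columns above.

def find_existing_people(row, snapshot):
    """Return (matched_field, records) using the documented precedence."""
    for field in ("Username", "Authentication Username",
                  "Organizational ID", "Primary Email"):
        value = row.get(field)
        if value:
            matches = [p for p in snapshot if p.get(field) == value]
            if matches:
                # Matched records are updated; the field that was matched
                # on is not changed by the import.
                return field, matches
    return None, []  # no match: a new record will be created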

Take special note of the Duplicate Detection & Validation of Choice-Based Fields section below for more information on how duplicate detection works!

Creating New Records

When creating new records, the default is to create a Customer. Unless you include the User Type column and specify a row value of User, Customer records will be created.

PLEASE NOTE: IF YOU SUBMIT FILES TO RUN DURING THE SAME NIGHTLY CYCLE, IT IS POSSIBLE FOR DUPLICATES TO BE CREATED. To prevent this, do not submit multiple files during the same nightly cycle that contain the same records, and make sure each file contains no duplicate rows.

Take special note of the Duplicate Detection & Validation of Choice-Based Fields section below for more information on how duplicate detection impacts user creation!

Duplicate Detection & Validation of Choice-Based Fields

Duplicate detection and validation for any field that is a choice-based field in the web interface happens against a snapshot of the system data at the time that an import starts. Organizations may have extremely large sets of data, making row-by-row duplicate detection and choice validation against live datasets simply too slow to process imports in a timely fashion. For this reason, when each import file starts processing, the processor for that file takes a snapshot of data for duplicate detection and choice field validation. This snapshot is what user duplicate detection checks against and what values for choice fields (ex: acct/dept or security role) are validated against for the entire import. Data snapshots are per file though and not reused across other imports.

This can particularly impact both duplicate detection columns and the Reports To Username column, because data created during the imports is not added to the snapshot of duplicate detection data or choice field data. The only exception to this validation matching is when new user records are in fact being created: the user creation process has additional last-minute, low-level uniqueness checks on username and authentication username that are always applied and will bubble up as a row validation error.

For example, take an import sheet with the following data in the following row order. In this example, assume that none of this data is in the system yet, so all of it would need to be created. Also assume that the other required columns are filled out; they are omitted below for brevity.

  1. Type ID: 1, Username: user.a@school.edu
  2. Type ID: 1, Username: user.a@school.edu
  3. Type ID: 1, Username: user.b@school.edu, Reports To Username: user.a@school.edu
  4. Type ID: 2, Primary Email: user.c@school.edu, Organization ID: 0011
  5. Type ID: 2, Primary Email: user.c@school.edu, Organization ID: 0011

The results of this import would be as follows:

  • Row 1 would successfully create a user record with username user.a@school.edu. 
  • Row 2 would fail with an error that user.a@school.edu is not unique. This would actually pass initial duplicate checks on usernames and fail due to the aforementioned last minute checks on username or authentication username uniqueness. No record would be created.
  • Row 3 would fail with an error that user.a@school.edu is not a valid username for Reports To Username. Even though row 1 did successfully create a user record, Reports To Username is still a choice field. The snapshot data for choice field validation would not have any data from rows added earlier in the same import, and thus it is failed. No record would be created.
  • Row 4 would successfully create a customer record with primary email user.c@school.edu and organization ID 0011.
  • Row 5 would successfully create a second customer record with primary email user.c@school.edu and organization ID 0011. Since the only duplicate detection fields available for customer records are primary email address and organization ID, and since the Row 5 data did not exist in the system at the time the import started, it would not be present in the snapshot data used for duplicate detection and thus a duplicate would not be detected.

For this reason, it is advised not to include any potentially duplicate user record data in any import file. You might end up with several errors about username uniqueness or, worse, multiple duplicate customer records. Similarly, you should not set Reports To Username values to usernames that must also be created in the same import; these will always fail validation and not be imported. One approach is to split such rows into a second file, as sketched below.
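
The sketch below, using openpyxl, moves rows whose Reports To Username refers to a user created in the same file into a second file, to be submitted in a later processing window once the first import has created the referenced users. It assumes the column names shown exist in your sheet and handles a single level of dependency; deeper reporting chains would need more passes.

from openpyxl import Workbook, load_workbook

src = load_workbook("PeopleImport.xlsx").active
rows = list(src.iter_rows(values_only=True))
header, data = rows[0], rows[1:]

username_col = header.index("Username")              # assumed column names
reports_to_col = header.index("Reports To Username")

new_usernames = {r[username_col] for r in data}      # users this file creates

def save(path, subset):
    wb = Workbook()
    ws = wb.active
    ws.append(header)
    for r in subset:
        ws.append(r)
    wb.save(path)

# Pass 1: rows that do not report to a user created in this same file
save("PeopleImport_pass1.xlsx",
     [r for r in data if r[reports_to_col] not in new_usernames])
# Pass 2: dependent rows, submitted after pass 1 has fully processed
save("PeopleImport_pass2.xlsx",
     [r for r in data if r[reports_to_col] in new_usernames])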

Not Mapping Values vs. Clearing Values

It is important to note that this process will only attempt to modify values for fields which are included in the import sheet and can be mapped to a TeamDynamix field.

For instance, if you do not provide a column for a user's home phone, that value will not be mapped and thus will not be changed in any way.

If you want to clear values, be sure that you provide a column for the field you want to clear and leave the cell values blank.
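
A small sketch contrasting the two behaviors (the column names are illustrative):

from openpyxl import Workbook

wb = Workbook()
ws = wb.active

# Because the "Home Phone" column is present, a blank cell CLEARS the value.
# Omitting the column entirely would leave home phones untouched.
ws.append(["Username", "Home Phone"])
ws.append(["user.a@school.edu", None])        # clears user.a's home phone
ws.append(["user.b@school.edu", "555-0100"])  # sets user.b's home phone

wb.save("PeopleImport.xlsx")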

Concerning Custom Attributes

The import does support custom attributes (though they are not included in the legacy template file, because the template cannot be specific to your organization). To update custom attribute fields, modify your import file as follows (a sketch follows these steps):

  1. Refer to the TDAdmin Import Job Help Page (or, for versions before 10.3, open the Excel .XLSX template file).
  2. In row 1, after the last column that has a value, enter the name of a custom attribute you wish to provide a value for. This must be named and cased exactly as the attribute name is listed in the TeamDynamix Admin application. Again, use the attribute name, not the attribute header text!
  3. For each row in your new column, enter the desired value for that record. If the custom attribute has choices, enter the choice name exactly (including case) as it appears in the Choices section of the Attribute Detail page in the TeamDynamix Admin application. Note that the API only supports one choice per record, even if the custom attribute itself supports multiple choices.
  4. Repeat steps 2 and 3 as necessary for each custom attribute you want to include in the import.
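
For example, a short sketch appending a custom attribute column with openpyxl; the attribute name "Building" and its choice value are placeholders for your organization's actual attribute names.

from openpyxl import load_workbook

wb = load_workbook("PeopleImport.xlsx")
ws = wb.active

col = ws.max_column + 1  # first column after the last populated header
ws.cell(row=1, column=col, value="Building")  # the attribute NAME, not header text

# One value per data row; choice values must match a choice name exactly,
# including case, and only one choice per record is supported.
ws.cell(row=2, column=col, value="Science Hall")

wb.save("PeopleImport.xlsx")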

Import Sheet File Generation and Usage Help

Versions 10.3 and Beyond

For all versions of TeamDynamix on or after 10.3, refer to the TDAdmin > Users & Roles > Import Jobs > Help page for available field help and usage details. The Import Job Help page will provide details on all available columns, how to format column headers, acceptable values, required field status and more. Use the information in this page to either manually create your .XLSX import sheet or use a script/system to programmatically generate it.

Versions 10.2 and Below (Legacy)

For versions 10.2 and below, request the appropriate import template sheet from TeamDynamix Support. Each column header in the template file has comments containing the accepted values and/or how to obtain those values. Columns marked as red are required to some degree and the column comments will indicate if it is always required or only required in certain cases. Some columns may only be required for creation and some columns may only impact records with a type of User.

For versions 10.2 and below, please use the import template as a basis for building your own file(s) to import. It is important to start with this because the template has all of the supported standard fields included and their proper column names. Any file which is generated by your processes needs to match column headers exactly, including case, for the standard fields. For instance, a column named User type would not be mapped to User Type because the casing is different. This might lead to customer records getting created instead of user records.

Regarding Passwords in Imports

The Password column is only required for TeamDynamix versions 9.4.0 and lower; in TeamDynamix 9.4.1 and up, it is no longer required. The Password column only sets the user's TeamDynamix-based authentication password. If your organization uses LDAP, Active Directory or Single Sign-On authentication, this value does not need to be stored in TeamDynamix and we highly recommend that you leave it blank. Leaving the Password value blank will also significantly increase import processing speed when new records are created, since password encryption/hashing can be skipped when no value will be saved in TeamDynamix.

As a final note, if Password is left blank for user records and your organization uses only TeamDynamix-based authentication, these users will not be able to sign in until their password is reset in the TDAdmin application.

Other Notes

Please note that the User Type, Username, Password, Authentication Provider, and Security Role properties will not be changed when updating existing users. These values are only used for creation or for matching existing users.

Common Errors

Import Data Row Validation Errors

These are the errors you might see when going to Admin > Users & Roles > Import Jobs > [Job] to view a particular import job's details.

Many columns in the import sheet are required (marked as pink in the column header), and some are not (ex: Reports To). Any error, irrespective of whether it occurs in a required column, will prevent that row from importing. If no value is provided for a required field (ex: Username), that row will error and not be imported. Even for columns that are not required (ex: Reports To), an erroneous value will prevent the row from being imported. This applies both to rows which would create new records and to rows which would update existing records.

The system will list a row number and then list one or more errors associated with that row. All row errors need to be corrected before that row can be successfully imported. 

Common errors for import jobs might look like this:

  • Row Number: 219, Name: Paul Bancroft, Email: , Message: 
    Username is required.
    Primary Email is required.
  • Row Number: 247, Name: Brandon Scutt, Email: bscutt@clientschool.edu, Message: 
    "Spanish TT" is not a valid choice for the "Acct/Dept" property.
  • Row Number: 2, Name: Don Carlson, Email: dcarlson8@clientschool.edu, Message: 
    "aschulz1@clientschool.edu" is not a valid choice for the "Reports To Username" property.
  • Row Number: 19, Name: Dale Quadros, Email: dquadros@clientschool.edu, Message: 
    Organization cannot be blank.

 

Object Reference Error Loading Worksheet

If you have sent a file with multiple worksheets in it, you may see an error like the following when the file is processed.

  • Object reference not set to an instance of an object.
    at PeopleImportFromFile.Program.GetStandardAndCustomAttributeMappings(List`1 eligibleMappings, ExcelWorksheet ws)
    at PeopleImportFromFile.Program.GetMappings(IServiceContext serviceContext, ExistingUserDictionaryCollection existingUsers, ExcelWorksheet ws)
    at PeopleImportFromFile.Program.ProcessImportTask(IServiceContext serviceContext, ImportTask currentTask)
    at PeopleImportFromFile.Program.Main(String[] args)

This means the processor picked the wrong worksheet to read data out of and found no column headers. For best results, only send files with a single worksheet of import data inside. This might entail manually removing instruction sheets or other secondary helper sheets.
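
If you generate files programmatically, one way to strip extra worksheets is sketched below with openpyxl. The sheet name "Data" is a placeholder; adjust it to your file.

from openpyxl import load_workbook

wb = load_workbook("PeopleImport.xlsx")
for name in wb.sheetnames:
    if name != "Data":  # keep only the worksheet containing import rows
        del wb[name]
wb.save("PeopleImport_single.xlsx")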

 

Invalid Excel File Errors

A common error in the import API is as follows:

Errors
The file is not an valid Package file. If the file is encrypted, please supply the password in the constructor.
at OfficeOpenXml.Packaging.ZipPackage..ctor(Stream stream)
at OfficeOpenXml.ExcelPackage.ConstructNewFile(String password)
at OfficeOpenXml.ExcelPackage..ctor(FileInfo newFile)
at PeopleImportFromFile.Program.ProcessImportTask(ImportTask currentTask, TDContext efcontext)
at PeopleImportFromFile.Program.Main(String[] args)

If you see this error, it means that the file you submitted to the API was not received as a valid Excel file. That doesn't necessarily mean the file you are attempting to send isn't a valid .xlsx file (though the file must be in .xlsx format); it means the file was not valid or was corrupted in the API call. Many times this is due to the call not being well-formed as a multipart form post. The call to this API method must be a multipart form post, and the body of the post must contain the file being sent to the API. We cannot provide specific examples of how to do this in every scripting language (though see the sketch under Content Type for Posting Import File above for one example), but that is generally what this error means.

Or it might be as simple as having sent a .xls file instead of a .xlsx file. If that is the case, you should only need to convert the file to .xlsx and resubmit!
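
If re-saving from Excel is inconvenient, one possible conversion sketch uses pandas, assuming the xlrd engine (for reading .xls) and openpyxl (for writing .xlsx) are installed:

import pandas as pd

df = pd.read_excel("PeopleImport.xls")         # read the legacy .xls file
df.to_excel("PeopleImport.xlsx", index=False)  # re-save as a true .xlsx file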

 

Package is an OLE Compound Document or is Encrypted

A common error in the import API is as follows:

Errors:
Can not open the package. Package is an OLE compound document. If this is an encrypted package, please supply the password
at OfficeOpenXml.ExcelPackage.ConstructNewFile(String password)
at OfficeOpenXml.ExcelPackage..ctor(FileInfo newFile)
at PeopleImportFromFile.Program.ProcessImportTask(ImportTask currentTask, TDContext efcontext)
at PeopleImportFromFile.Program.Main(String[] args)

If you see this error, it means one of two things:

  1. The file you submitted to the API was a .xls file with its file extension renamed to .xlsx. As stated previously, this will not work. The file must be in the Excel 2007 .XLSX format; .xls files must be converted to .xlsx (i.e. re-saved specifying .xlsx) and cannot simply be renamed.
  2. The file is password protected. The TeamDynamix API does not support password protected .XLSX files. The file will need to be resubmitted with password protection removed.

 

Data is not Updating Correctly Due to Values Stored as Text

The import process is not able to read data incorrectly stored as text in Excel files. This typically impacts fields which are numbers, and it only comes into play when Excel flags cells in your spreadsheet with a green indicator in the top-left corner.

Cells with that green indicator contain numbers stored as text. These cells need to be converted to true Excel numbers so that the TeamDynamix import process can read them. To fix a cell, select it, click the warning icon Excel displays, and choose Convert to Number; once you fix one cell, you can typically copy and paste that fix across the other cells.

If you leave the data stored as text, your imports may fail or may not update data on the user records correctly.
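
As an alternative to fixing the cells by hand in Excel, here is a small openpyxl sketch that converts number-like text back to true numbers. Be careful with columns such as Organization ID that must keep leading zeros; only convert columns that are truly numeric.

from openpyxl import load_workbook

wb = load_workbook("PeopleImport.xlsx")
ws = wb.active

for row in ws.iter_rows(min_row=2):  # skip the header row
    for cell in row:
        if isinstance(cell.value, str) and cell.value.strip().isdigit():
            cell.value = int(cell.value)  # store as a true Excel number

wb.save("PeopleImport.xlsx")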

Other Resources

Use the links below to help with generating .XLSX files from older .XLS files. For .CSV files, you may be able to simply rename them to .XLS and then run them through these toolsets.

  1. The 2010 (and presumably latest) Microsoft Office Migration Planning Manager (OMPM) tool
     This provides a toolset to convert pre-Office 2007 file formats to their newer Office 2007+ equivalents. Note that this requires the Microsoft Office Compatibility Pack to be installed. Be sure to look up Microsoft's documentation of OMPM for more details on how to use it.