Direct Tellent Recruitee integration for Snowflake
Connect your Tellent Recruitee data directly to Snowflake for a central overview of your recruitment process. The connector automatically exposes your ATS data for analysis in your own environment, without you having to build API integrations yourself.
This connector is available on request
Contact us.
What does the connector do?
The Tellent Recruitee Connector provides access to the key parts of your recruitment pipeline:
Candidate Management
Gain in-depth insight into your candidates and their progress through each stage of the hiring pipeline.
Jobs & Locations
Track your job openings and office locations directly in Snowflake, from publication through to hire.
Offers & Conversion
Analyze the offers you make and their conversion to hires to optimize the final stage of your funnel.
Optimized data processing and security
Fast configuration
Smart data ingestion
Documentation
About Tellent Recruitee
A powerful, flexible ATS that brings your hiring team together in one place to streamline decisions, customize workflows, and stay compliant. Hire smarter and faster, YOUR way.
Get the Connector in Snowflake
The Tellent Recruitee connector is available upon request only. Contact us directly to obtain access to it.
Obtain Credentials and Create a Connection
To authenticate with the Tellent Recruitee API, you need two credentials: an API token and your Company ID. Refer to the official documentation to locate these.
Once you have both credentials, follow these steps to connect via the Nimbus Intelligence Connector:
- Enter your Company ID in the provided field on the Connections page and click "Create Connection".
- A pop-up will prompt you to generate an external access integration. Click Next.
- Enter your API token (secret) when prompted and click Next again.
After completing these steps, your Tellent Recruitee account should be successfully connected to your Snowflake environment.
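Under the hood, both credentials end up in every request the connector makes: the Company ID becomes part of the URL path and the API token is sent as a Bearer token. A minimal sketch of how the two combine into a request (the base URL and header shape follow Recruitee's public API documentation, but verify them against the official docs before relying on this):

```python
# Sketch: how the API token and Company ID combine into a Recruitee API request.
# The base URL and header layout are taken from Recruitee's public API docs;
# treat the exact paths as assumptions and check the official documentation.

def build_request(company_id: str, api_token: str, endpoint: str):
    """Return the URL and headers for a Recruitee API call."""
    url = f"https://api.recruitee.com/c/{company_id}/{endpoint}"
    headers = {
        "Authorization": f"Bearer {api_token}",  # the secret entered in the pop-up
        "Accept": "application/json",
    }
    return url, headers

url, headers = build_request("12345", "my-secret-token", "candidates")
print(url)  # https://api.recruitee.com/c/12345/candidates
```

If a request with these credentials returns an authentication error, double-check both values in the official Recruitee documentation referenced above.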
Available Endpoints
Check the full API Documentation from Tellent Recruitee.
| Endpoint | Incremental Ingestion |
| --- | --- |
| Candidates | Yes |
| Departments | Not supported by Tellent Recruitee |
| Locations | Yes |
| Offers | Yes |
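As the table shows, most endpoints support incremental ingestion while Departments does not, so a sync has to branch per endpoint. A hypothetical sketch of that decision (the endpoint names mirror the table above; the strategy logic is illustrative, not the connector's actual implementation):

```python
# Hypothetical sketch of per-endpoint ingestion strategy, mirroring the
# table above. Not the connector's real code; for illustration only.

INCREMENTAL_SUPPORT = {
    "candidates": True,
    "departments": False,  # not supported by Tellent Recruitee
    "locations": True,
    "offers": True,
}

def ingestion_strategy(endpoint: str) -> str:
    """Use incremental ingestion where the API supports it, else a full refresh."""
    return "incremental" if INCREMENTAL_SUPPORT.get(endpoint, False) else "full_refresh"

print(ingestion_strategy("candidates"))   # incremental
print(ingestion_strategy("departments"))  # full_refresh
```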
Need Help?
Is an endpoint you need missing, or did you find a bug? If you didn't find your answer in the FAQs, let us know!
If you have an issue, check the Limitations and Known Issues section to see if we are already working on it!
Getting Started
How to install a Connector Application
- Sign in to your Snowflake account.
- Head to the Marketplace section and look for a DDBM Connector Application.
- Press the Get button and wait for the installation to complete.
How to create a new Connection
- Open the installed application (see How to install a Connector Application above). You will land on the Homepage.
- Head to the Connections page to create a new Connection.
- Fill in the form in the "Manage Connections" section.
- Click on the blue button to generate the pre-filled script.
- Copy and paste the script into a new Worksheet and execute it completely.
- Go back to the app and you will see the new connection appear under the "Current Connections" section.
You can select an endpoint and see whether your new Connection has access to it. If you get an error at this point, it means one of the following:
- The credentials entered are not valid. In this case, recreate the connection, making sure there are no copy-paste errors, or obtain a new set of credentials.
- The credentials entered do not have access to this specific endpoint. This can usually be fixed by changing the scopes in the settings of the Connector's (Personio, Hubspot...) developer account.
How to create a new Configuration
- Head to the Configurations page to create a new Configuration.
- Select a previously created Connection (see How to create a new Connection above).
- Fill in the form, providing a configuration name of your choice, a target database/schema... (*)
- Press the blue button and you will see the new Configuration appear in the "Current Configurations" section after a few seconds.
With a Configuration created you can already fetch data, although we recommend creating Schedules (see How to create a new Schedule below).
You can also edit Configurations by selecting them and changing the form.
(*) If the target database/schema already exists, check the How to use pre-existing Databases and Schemas section below.
How to create a new Schedule
- Head to the Schedules page to create a new Schedule.
- Select a previously created Connection and Configuration (see How to create a new Connection/Configuration above).
- Fill in the form, providing a schedule name of your choice, whether to use Full Refresh or Incremental jobs, and when to periodically run the job.
- Press the blue button and you will see the new Schedule appear in the "Current Schedules" section after a few seconds.
With a Schedule created, your data will be fetched periodically in the background. You can check the history of past jobs on the Job History page.
How to use pre-existing Databases and Schemas
When creating a Configuration you will have to specify where the fetched data should be saved. New tables are created in a Database and Schema of your choice. If either of these already exists, the application will need some privileges to be able to create tables in them.
Pre-existing Database
Execute the following command:
grant usage, create schema on database <db_to_use> to application <app_name>;
Pre-existing Schema
Execute the previous command and the following one:
grant usage, create table on schema <db_to_use>.<schema_to_use> to application <app_name>;
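If you run these grants for several databases or applications, rendering them from your identifiers can prevent copy-paste mistakes. A small convenience sketch (the helper function is hypothetical; the SQL text it produces is exactly the commands shown above):

```python
# Convenience sketch: render the grant statements from this section for a
# given database, schema, and application name. Only the helper itself is
# hypothetical; the SQL mirrors the commands above.
from typing import List, Optional

def grant_statements(db: str, app: str, schema: Optional[str] = None) -> List[str]:
    """Return the grants needed for a pre-existing database (and schema)."""
    stmts = [f"grant usage, create schema on database {db} to application {app};"]
    if schema is not None:
        # A pre-existing schema needs the previous grant plus this one.
        stmts.append(
            f"grant usage, create table on schema {db}.{schema} to application {app};"
        )
    return stmts

for stmt in grant_statements("ANALYTICS", "RECRUITEE_CONNECTOR", schema="RAW"):
    print(stmt)
```

Paste the printed statements into a Snowflake Worksheet and execute them with a role that owns the target database.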
Frequently Asked Questions
What is the pricing of your Connectors?
After a free 7-day trial, we will only charge:
- $0.10 per refresh
- $5 per million rows ingested
As an example, let's say that you set up a scheduled task on working days at 7:00 a.m. for your DDBM Exact Connector, and that during each refresh you ingest 2,000 rows (only new and updated rows, thanks to our incremental ingestion). In that case, you would only pay $0.55 per week!
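The arithmetic behind that example, five refreshes per week at $0.10 each plus 2,000 rows per refresh at $5 per million rows:

```python
# Reproduce the pricing example: 5 working-day refreshes per week,
# 2,000 rows ingested per refresh.
refreshes_per_week = 5
rows_per_refresh = 2_000

refresh_cost = refreshes_per_week * 0.10                              # $0.50
ingest_cost = refreshes_per_week * rows_per_refresh * 5 / 1_000_000   # $0.05
weekly_cost = refresh_cost + ingest_cost

print(f"${weekly_cost:.2f} per week")  # $0.55 per week
```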
Besides that, our applications are built as Snowflake Native Apps and, since they run on your own Snowflake instance, you will be charged by Snowflake for the warehouse compute cost.
How do I upload data into preexisting databases and schemas?
Configuring a connection to save data in preexisting databases or schemas will throw an error if the privileges were not properly granted to the application.
Solution
Execute the following commands:

```sql
-- For preexisting databases
grant usage on database <db_to_use> to application <application_name>;
grant create schema on database <db_to_use> to application <application_name>;

-- For preexisting databases and schemas
grant usage on schema <db_to_use>.<schema_to_use> to application <application_name>;
grant create table on schema <db_to_use>.<schema_to_use> to application <application_name>;
```
These commands will grant the application permission to see, use and create new tables on preexisting databases and schemas.
What happens if I set up wrong credentials or my task ingestion fails?
We will only charge for data that has actually been fetched and correctly ingested in your account. If all calls to the API return errors, then there will be no cost for that run.
If you have made a mistake with your credentials, simply delete the connection and create a new one with the correct credentials. Make sure to test the connection before setting up the scheduled task and your data will be ready in no time!
Incremental Ingestion vs. Incremental Fetching
- Incremental Ingestion: All our connectors have an incremental ingestion strategy implemented. When data is fetched from an API (Personio, Active Campaign, Hubspot...) we check the last time it was updated and compare it with the previously saved data in your Snowflake account. If the data in your account is already up to date, we discard the fetched data and you are not charged for it. Not all endpoints provide the information necessary for this check and, in those cases, we ingest the data.
- Incremental Fetching: Some APIs allow fetching data given date and time parameters. In those cases we can fetch only the updated data from the APIs, making the data ingestion process much faster and cheaper. The amount you will be charged is the same as in the Incremental Ingestion process, but it will still be much cheaper because the warehouses run for a much shorter time. Not all APIs provide the information necessary for this method and, in those cases, the Incremental Ingestion strategy applies.
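The incremental-ingestion check described above amounts to comparing each fetched record's last-update timestamp against what is already stored, and ingesting (and billing) only the newer records. A simplified sketch, with field names assumed for illustration:

```python
# Simplified sketch of the incremental-ingestion check described above.
# Field names ("id", "updated_at") are illustrative assumptions, not the
# connector's actual schema.
from datetime import datetime

def records_to_ingest(fetched, stored):
    """Keep only records that are new or updated since the last stored version."""
    keep = []
    for record in fetched:
        last_seen = stored.get(record["id"])
        if last_seen is None or record["updated_at"] > last_seen:
            keep.append(record)  # new or updated: ingest (and charge)
    return keep  # unchanged records are discarded at no cost

fetched = [
    {"id": 1, "updated_at": datetime(2024, 5, 2)},  # updated since last run
    {"id": 2, "updated_at": datetime(2024, 4, 1)},  # unchanged
]
stored = {1: datetime(2024, 5, 1), 2: datetime(2024, 4, 1)}
print([r["id"] for r in records_to_ingest(fetched, stored)])  # [1]
```

Endpoints that expose no update timestamp cannot support this check, which is why, per the list above, their data is ingested on every run.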
Limitations and Known issues
Limitations
Only one task can be configured per connection
For now, only one task can be set up per connection.
Workaround
If you need different refreshing schedules for different endpoints, create different connections with the same credentials (but different endpoint selection, if you wish so) and configure different tasks for them.
Leaving the App / App becoming Idle
Leaving the app during data fetching (using the "Fetch Data" button) or letting the app go idle may cause incomplete data loading.
Workaround
Stay in the app while fetching data or use tasks, which are not affected by this behavior.
Wrong Credentials
- Limitation: If you enter incorrect credentials in the connection script, the log table may incorrectly show success. This doesn't apply to OAuth authentication, which will report an error.
- Workaround: Always provide the correct credentials and use the "Test Connection" button to verify before fetching data or setting tasks.
