
Databricks and ParseHub integration

Save yourself the work of writing custom integrations for Databricks and ParseHub and use n8n instead. Build adaptable and scalable analytics and development workflows that work with your technology stack, all within a building experience you'll love.

How to connect Databricks and ParseHub

  • Step 1: Create a new workflow
  • Step 2: Add and configure nodes
  • Step 3: Connect
  • Step 4: Customize and extend your integration
  • Step 5: Test and activate your workflow

Step 1: Create a new workflow and add the first step

In n8n, click the "Add workflow" button in the Workflows tab to create a new workflow. Add the starting point: a trigger that determines when your workflow should run, such as an app event, a schedule, a webhook call, another workflow, an AI chat, or a manual trigger. Sometimes, the HTTP Request node might already serve as your starting point.


Step 2: Add and configure Databricks and ParseHub nodes using the HTTP Request nodes

Add the HTTP Request nodes onto your workflow canvas. Set credentials for Databricks and ParseHub as appropriate using generic authentication methods. The HTTP Request nodes make custom API calls to Databricks and ParseHub to query the data you need. Configure nodes one by one: input data on the left, parameters in the middle, and output data on the right.
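Under the hood, the HTTP Request node simply issues an authenticated HTTP call. As a rough sketch (not n8n's actual implementation), here is how such a request could be assembled in Python. The workspace URL and token are placeholders, and the Bearer-token header is one common Databricks authentication style:

```python
import urllib.request

# Hypothetical values: substitute your own workspace URL and access token.
DATABRICKS_HOST = "https://example.cloud.databricks.com"
DATABRICKS_TOKEN = "dapi-placeholder-token"

def build_databricks_request(path: str) -> urllib.request.Request:
    """Build (but do not send) an authenticated GET request, mirroring
    what the HTTP Request node does with header-based authentication."""
    return urllib.request.Request(
        url=f"{DATABRICKS_HOST}{path}",
        headers={"Authorization": f"Bearer {DATABRICKS_TOKEN}"},
        method="GET",
    )

req = build_databricks_request("/api/clusters/list")
print(req.get_method(), req.full_url)
```

In n8n itself you would configure the same URL and header through the node's credential settings rather than in code.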


Step 3: Connect Databricks and ParseHub

A connection establishes a link from Databricks to ParseHub (or vice versa), routing data through the workflow. Data flows from the output of one node to the input of another. Each node can have single or multiple connections.
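That item-passing model can be sketched in plain Python. n8n passes data between nodes as a list of items, each wrapping a "json" object; the ParseHub output shape below is made up for illustration:

```python
# Hypothetical output items from an upstream ParseHub request:
parsehub_output = [{"json": {"run_token": "t_abc123", "status": "complete"}}]

def to_databricks_input(items):
    """Reshape upstream items into the fields a downstream node expects."""
    return [{"json": {"source_run": it["json"]["run_token"]}} for it in items]

print(to_databricks_input(parsehub_output))
```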


Step 4: Customize and extend your Databricks and ParseHub integration

Use n8n's core nodes such as If, Split Out, Merge, and others to transform and manipulate data. Write custom JavaScript or Python in the Code node and run it as a step in your workflow. Connect Databricks and ParseHub with any of n8n’s 1000+ integrations, and incorporate advanced AI logic into your workflows.
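For instance, logic like the following could live in a Code node (Python mode). The sample items are invented so the snippet runs standalone; inside n8n, the incoming items would be supplied by the previous node:

```python
# Invented sample items standing in for a previous node's output:
items = [
    {"json": {"name": "run-1", "rows": 120}},
    {"json": {"name": "run-2", "rows": 0}},
]

def transform(items):
    """Drop empty results (much like an If node branch) and flag the rest."""
    out = []
    for item in items:
        if item["json"]["rows"] > 0:
            item["json"]["has_data"] = True
            out.append(item)
    return out

print(transform(items))
```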


Step 5: Test and activate your Databricks and ParseHub workflow

Save and run the workflow to see if everything works as expected. Based on your configuration, data should flow from Databricks to ParseHub or vice versa. Debugging is easy: check past executions to isolate and fix any errors. Once you've tested everything, save your workflow and activate it.


Build your own Databricks and ParseHub integration

Create custom Databricks and ParseHub workflows by choosing triggers and actions. Nodes come with global operations and settings, as well as app-specific parameters that can be configured. You can also use the HTTP Request node to query data from any app or service with a REST API.

Supported API Endpoints for Databricks

To set up Databricks integration, add the HTTP Request node to your workflow canvas and authenticate it using a generic authentication method. The HTTP Request node makes custom API calls to Databricks to query the data you need using the API endpoint URLs you provide.

  • List clusters (GET /api/clusters/list): Retrieve a list of all the clusters in your Databricks workspace.
  • Create cluster (POST /api/clusters/create): Creates a cluster with the specified Databricks Runtime version and cluster node type.
  • Delete cluster (DELETE /api/clusters/delete): Permanently deletes a cluster from your Databricks workspace.
  • Delete cluster (DELETE /api/v1/clusters/permanent_delete): Permanently deletes the cluster with the specified cluster ID from the workspace.
  • Create cluster (POST /api/v1/clusters/create): Creates a new cluster in the Databricks workspace.

These API endpoints were generated using n8n

This n8n AI workflow turns web scraping into an AI-powered knowledge extraction system that uses vector embeddings to semantically analyze, chunk, store, and retrieve the most relevant API documentation from web pages. Remember to check the official Databricks documentation for the full list of API endpoints, and verify the scraped ones!
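As an illustration of what a workflow might do with a list-clusters response, the sketch below filters running clusters out of a sample JSON body. The payload shape is an assumption, so verify it against the Databricks documentation:

```python
import json

# A hypothetical response body from a list-clusters call; the real payload
# shape may differ.
body = (
    '{"clusters": [{"cluster_id": "c-1", "state": "RUNNING"},'
    ' {"cluster_id": "c-2", "state": "TERMINATED"}]}'
)

def running_cluster_ids(response_body: str) -> list:
    """Pick out the IDs of running clusters from a list-clusters response."""
    data = json.loads(response_body)
    return [c["cluster_id"] for c in data.get("clusters", []) if c["state"] == "RUNNING"]

print(running_cluster_ids(body))
```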

  • Create job (POST /api/v1/jobs/create): Creates a Databricks job that runs the specified notebook on the specified cluster.

  • Create directory (POST /api/v1/files/create_directory): Creates an empty folder in a volume.
  • Upload file (POST /api/v1/files/upload): Uploads a file to a volume.
  • List directory contents (GET /api/v1/files/list_directory_contents): Lists the contents of a volume.
  • Delete file (DELETE /api/v1/files/delete): Deletes a file from a volume.
  • Delete directory (DELETE /api/v1/files/delete_directory): Deletes a folder from a volume.
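As a hedged sketch, an upload request to the scraped upload endpoint could be assembled like this. The host, the volume path, and the "path" query-parameter name are all assumptions; check the Databricks Files API documentation before relying on them:

```python
import urllib.parse
import urllib.request

# Hypothetical workspace URL; the scraped endpoint path is used as-is.
HOST = "https://example.cloud.databricks.com"

def build_upload_request(volume_path: str, payload: bytes) -> urllib.request.Request:
    """Build (but do not send) a file-upload request for a volume path."""
    query = urllib.parse.urlencode({"path": volume_path})
    return urllib.request.Request(
        url=f"{HOST}/api/v1/files/upload?{query}",
        data=payload,
        headers={"Content-Type": "application/octet-stream"},
        method="POST",
    )

req = build_upload_request("/Volumes/main/default/vol1/report.csv", b"a,b\n1,2\n")
print(req.get_method(), req.full_url)
```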

  • List groups (GET /api/v1/groups/list): Lists the display names for all of the available groups within the Databricks account.

Supported API Endpoints for ParseHub

To set up ParseHub integration, add the HTTP Request node to your workflow canvas and authenticate it using a generic authentication method. The HTTP Request node makes custom API calls to ParseHub to query the data you need using the API endpoint URLs you provide.

  • Get project (GET /v2/api/projects/get): Retrieve details of a specific project.
  • Run project (POST /v2/api/projects/run): Initiate a run for a specific project.
  • List all projects (GET /v2/api/projects/list): Retrieve a list of all projects.
  • Get project (GET /api/v2/projects/{PROJECT_TOKEN}): Retrieve details about a specific project using its token.
  • Run project (POST /api/v2/projects/{PROJECT_TOKEN}/run): Starts running an instance of the project on the ParseHub cloud.
  • List projects (GET /api/v2/projects): Gets a list of projects in your account.
  • Get last ready data (GET /api/v2/projects/{PROJECT_TOKEN}/last_ready_run/data): Returns the data of the last ready run for a project.

These API endpoints were generated using n8n

This n8n AI workflow turns web scraping into an AI-powered knowledge extraction system that uses vector embeddings to semantically analyze, chunk, store, and retrieve the most relevant API documentation from web pages. Remember to check the official ParseHub documentation for the full list of API endpoints, and verify the scraped ones!
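ParseHub keys most requests off a project token. As a small sketch, the run and last-ready-data URLs can be built from that token; the base URL matches the endpoint paths listed above, but both it and the token placeholder should be verified against the ParseHub documentation:

```python
# Base URL matching the /api/v2/... endpoint paths listed above.
API_BASE = "https://www.parsehub.com/api/v2"

def run_url(project_token: str) -> str:
    """URL that starts a run of the given project."""
    return f"{API_BASE}/projects/{project_token}/run"

def last_ready_data_url(project_token: str) -> str:
    """URL that returns data from the project's last ready run."""
    return f"{API_BASE}/projects/{project_token}/last_ready_run/data"

print(run_url("t_placeholder"))  # token is a placeholder
```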

  • Get run (GET /v2/api/runs/get): Retrieve details of a specific run.
  • Get data for run (GET /v2/api/runs/data): Retrieve data for a specific run.
  • Get last ready data (GET /v2/api/runs/last_ready_data): Retrieve the last ready data from a run.
  • Cancel run (POST /v2/api/runs/cancel): Terminate a specific run.
  • Delete run (DELETE /v2/api/runs/delete): Permanently delete a specific run.
  • Get run (GET /api/v2/runs/{RUN_TOKEN}): Retrieve a specific run by its token.
  • Get run data (GET /api/v2/runs/{RUN_TOKEN}/data): Returns the data that was extracted by a run.
  • Cancel run (POST /api/v2/runs/{RUN_TOKEN}/cancel): Cancels a run and changes its status to cancelled.
  • Delete run (DELETE /api/v2/runs/{RUN_TOKEN}): Cancels a run if it is running, and deletes the run and its data.
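A run is usually not ready immediately, so workflows tend to poll its status before fetching data. The sketch below simulates that loop without network access (the status values are assumptions); in n8n you would typically combine HTTP Request and Wait nodes instead:

```python
def wait_for_run(fetch_status, max_polls=10):
    """Poll a run's status until it completes, fails, or we give up.
    fetch_status is injected so this sketch runs without network access."""
    for _ in range(max_polls):
        status = fetch_status()
        if status == "complete":
            return True
        if status in ("cancelled", "error"):
            return False
    return False

# Simulated statuses a run might report across successive polls:
statuses = iter(["initialized", "running", "complete"])
print(wait_for_run(lambda: next(statuses)))
```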

FAQs

  • Can Databricks connect with ParseHub?

  • Can I use Databricks’s API with n8n?

  • Can I use ParseHub’s API with n8n?

  • Is n8n secure for integrating Databricks and ParseHub?

  • How do I get started with the Databricks and ParseHub integration in n8n?

Looking to integrate Databricks and ParseHub in your company?

Over 3000 companies switch to n8n every single week

Why use n8n to integrate Databricks with ParseHub

Build complex workflows, really fast

Handle branching, merging and iteration easily.
Pause your workflow to wait for external events.

Code when you need it, UI when you don't

Simple debugging

Your data is displayed alongside your settings, making edge cases easy to track down.

Use templates to get started fast

Use 1000+ workflow templates available from our core team and our community.

Reuse your work

Copy and paste, easily import and export workflows.

Implement complex processes faster with n8n
