Using Kafka and Spark connectors
Neo4j provides officially supported connectors for Kafka and Spark that can read data from and write data to graphs. The Kafka connector makes it easy to ingest data into Neo4j at scale without building custom client code, while the Spark connector simplifies reading and writing graph data using DataFrames. Let’s take a look at the core features these connectors provide:
- Kafka connector:
  - Provides the capability to ingest data into Neo4j using templatized Cypher queries
  - Can handle streaming data efficiently
  - Runs as a plugin on existing Kafka installations
  - You can read more about this connector at https://neo4j.com/labs/kafka/4.1/kafka-connect/
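As a rough illustration of the templatized-Cypher approach, a sink connector configuration maps a topic to a Cypher statement in which `event` refers to each incoming message. The topic name, credentials, and the `Person` node shape below are placeholder examples, not values from this book:

```json
{
  "name": "Neo4jSinkConnector",
  "config": {
    "connector.class": "streams.kafka.connect.sink.Neo4jSinkConnector",
    "topics": "people",
    "neo4j.server.uri": "bolt://localhost:7687",
    "neo4j.authentication.basic.username": "neo4j",
    "neo4j.authentication.basic.password": "password",
    "neo4j.topic.cypher.people": "MERGE (p:Person {id: event.id}) SET p.name = event.name"
  }
}
```

Each record arriving on the `people` topic is applied to the graph through the templated `MERGE`, so no custom consumer code is needed.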
- Spark connector:
  - Makes it easier to read nodes and relationships into a dataframe
  - Makes it possible to take the data from dataframes and write it into Neo4j easily
  - Supports using Python or R as the language of choice in Spark
  - Makes it easier to leverage all the capabilities of Spark to massage the data before...
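The read-transform-write workflow above can be sketched in PySpark. This is a minimal sketch, assuming a Neo4j instance at `bolt://localhost:7687`, the Neo4j Spark Connector jar on the classpath, and example label and property names (`Person`, `age`, `id`) that are illustrative rather than taken from this book:

```python
# Illustrative sketch: read Neo4j nodes into a DataFrame,
# transform them with Spark, and write the result back.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("neo4j-example").getOrCreate()

# Read all nodes with the Person label into a DataFrame
people_df = (
    spark.read.format("org.neo4j.spark.DataSource")
    .option("url", "bolt://localhost:7687")
    .option("authentication.basic.username", "neo4j")
    .option("authentication.basic.password", "password")
    .option("labels", "Person")
    .load()
)

# Use ordinary DataFrame operations to massage the data
adults_df = people_df.filter(people_df.age >= 18)

# Write the transformed rows back as nodes, keyed on the id property
(
    adults_df.write.format("org.neo4j.spark.DataSource")
    .mode("Overwrite")
    .option("url", "bolt://localhost:7687")
    .option("authentication.basic.username", "neo4j")
    .option("authentication.basic.password", "password")
    .option("labels", ":Adult")
    .option("node.keys", "id")
    .save()
)
```

Because the connector exposes graphs as DataFrames, everything between the read and the write is plain Spark, so the full transformation API is available.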