
Getting Started with Kafka Connect for New Relic

Gavin Johnson, Rahul Bhattacharya
October 9th, 2020 · 4 min read

This post was originally published on Confluent’s blog as part of a small go-to-market effort for the Kafka Connector for New Relic. It was republished on the New Relic blog. I co-wrote this post with Rahul Bhattacharya from Confluent.

It’s 3:00 a.m., and PagerDuty keeps firing alerts telling you your application is down. You need to figure out what the issue is, whether it’s impacting users, and resolve it quickly so your users aren’t interrupted and you can get back to sleep. A lot of people rely on New Relic for exactly this.

Traditionally, the telemetry data that is stored in New Relic’s Telemetry Data Platform has come from New Relic’s own instrumentation. More and more, that isn’t the case, with open source tools and alternative instrumentation sending data to the Telemetry Data Platform. Now, it’s easier than ever to build these observability pipelines with the New Relic connector for Kafka Connect, available both on the Confluent Hub and open source on GitHub.

The tutorial below details how to set up the connector for both events and metrics and provides simplified examples of sending each data type to New Relic through the connector.

[Diagram: New Relic without Kafka Connect]

New Relic Connector for Kafka Connect

The New Relic connector for Kafka Connect allows you to ingest data from Apache Kafka® topics into the New Relic platform without writing a single line of code. Because we’re working with Kafka Connect, the connector is purely driven by configuration, which you apply using the Kafka Connect REST API. Things like retries, error handling, and dead letter queues are handled for you by Kafka Connect and can be customized as required. Achieving this with a Kafka consumer would require many hours of development time and testing.

Currently, the connector supports the Metric and Event telemetry data types. We are working on releasing code for the remaining data types (Log and Trace) soon.

To get started, head over to Confluent Hub and follow the instructions to install the connector into your local Kafka cluster. After that, you can use the Kafka Connect REST API to launch a connector. This will require an API key from New Relic to start sending data.
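For example, preparing a connector config and launching it through the Kafka Connect REST API can be sketched like this. This is a minimal Python sketch, not an official client; the host (localhost:8083) and the angle-bracket placeholder values are assumptions to replace with your own.

```python
import json

# Hypothetical connector config -- fill in the placeholders with the
# connector class, topic, and New Relic API key for your setup.
connector_config = {
    "name": "my-connector",
    "config": {
        "connector.class": "<CONNECTOR_CLASS>",
        "topics": "<TOPIC>",
        "api.key": "<NEW_RELIC_API_KEY>",
    },
}

# The JSON body you POST to the Kafka Connect REST API.
payload = json.dumps(connector_config)

# Launch:  curl -X POST -H "Content-Type: application/json" \
#            -d "$payload" http://localhost:8083/connectors
# Status:  curl http://localhost:8083/connectors/my-connector/status
print(payload)
```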

The connector takes care of all the error handling for you—if messages do not follow the recommended format, you can pipe them to a dead letter queue (DLQ) and also decide whether you want to abort processing if a bad message is read, or continue regardless. This blog post explains these options in more detail.

Runtime errors like connectivity issues, a wrong API key, or an oversized payload are handled in different ways:

  • The wrong API key will shut down the connector task immediately.
  • If there is a network error, the connector will retry up to a user-configured limit.
  • If the payload is too large, the connector splits it and retries until everything is sent. From a user’s perspective, you don’t need to worry about this.
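The split-and-retry behavior can be sketched like this. It is illustrative only: the real connector’s size limits and HTTP transport differ, and the byte limit here is artificially tiny for demonstration.

```python
MAX_PAYLOAD_BYTES = 50  # tiny limit purely for demonstration

def send_split(records, deliveries):
    """Send a batch, recursively halving it whenever the payload is too big."""
    payload = ",".join(records)
    if len(payload.encode("utf-8")) <= MAX_PAYLOAD_BYTES or len(records) == 1:
        deliveries.append(payload)  # stand-in for the HTTP POST to New Relic
        return
    mid = len(records) // 2
    send_split(records[:mid], deliveries)
    send_split(records[mid:], deliveries)

sent = []
send_split([f"record-{i}" for i in range(8)], sent)
# 8 records don't fit in one payload, so they go out as two smaller ones.
```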

Now let’s look in more detail at how to set up an Event or Metric connector.


Events

The Event API is supported by New Relic. To create an Event connector, you can use the following sample JSON to configure the connector in Kafka Connect (replace <NEW_RELIC_API_KEY> with the API key you got from New Relic):

{
  "name": "events-connector",
  "config": {
    "connector.class": "com.newrelic.telemetry.events.EventsSinkConnector",
    "value.converter": "com.newrelic.telemetry.events.EventsConverter",
    "topics": "nrevents",
    "api.key": "<NEW_RELIC_API_KEY>"
  }
}

The above JSON will create a connector with the name events-connector. It will start sinking messages from a topic called nrevents. A sample Event message looks something like this (note that the timestamp field in the payload needs to be the Unix epoch in milliseconds):

[
  {
    "eventType": "Purchase",
    "account": 3,
    "amount": 259.54,
    "timestamp": <CURRENT_TIMESTAMP>
  },
  {
    "eventType": "Purchase",
    "account": 5,
    "amount": 12309,
    "product": "Item",
    "timestamp": <CURRENT_TIMESTAMP>
  }
]

The above message creates two events in New Relic of the type Purchase. Make sure you use the current timestamp. You can list them in New Relic One’s Query Builder by running a query like SELECT * from Purchase.
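A quick way to generate a payload like this with a valid millisecond timestamp is sketched below. The variable names are mine, not part of the connector; the JSON string at the end is what you would write to the nrevents topic.

```python
import json
import time

# Stamp the sample Purchase events with the current Unix epoch time
# in milliseconds, as the Event API requires.
now_ms = int(time.time() * 1000)

events = [
    {"eventType": "Purchase", "account": 3, "amount": 259.54,
     "timestamp": now_ms},
    {"eventType": "Purchase", "account": 5, "amount": 12309,
     "product": "Item", "timestamp": now_ms},
]

# The message body for the nrevents topic.
message = json.dumps(events)
```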


Your messages in the topic should adhere to this format. For an event, eventType and timestamp are required fields. The EventsConverter validates these messages and flags any that do not comply with the required format as errors.
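That validation can be sketched as follows. This is a simplified assumption about the converter’s behavior, not its actual source code:

```python
# Every event must carry the two required Event API fields.
REQUIRED_EVENT_FIELDS = {"eventType", "timestamp"}

def is_valid_event(event: dict) -> bool:
    """Return True when all required Event API fields are present."""
    return REQUIRED_EVENT_FIELDS.issubset(event)

valid = is_valid_event({"eventType": "Purchase", "timestamp": 1602230400000})
invalid = is_valid_event({"account": 3})  # missing eventType and timestamp
```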


Metrics

The details of the Metric API can be found in the documentation. Metrics fall broadly into three categories:

  1. Gauge: Examples include temperature, CPU usage, and memory.
  2. Count: Examples include cache hits per reporting interval and the number of threads created per reporting interval.
  3. Summary: Examples include transaction count/durations and queue count/durations.
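The shape of each type can be sketched with small helpers. These helper names are mine, not a New Relic SDK; per the Metric API, count and summary metrics also carry an interval.ms field giving the reporting interval.

```python
import time

def _now_ms() -> int:
    """Current Unix epoch time in milliseconds."""
    return int(time.time() * 1000)

def gauge(name, value, **attributes):
    # A point-in-time value, e.g. temperature or CPU usage.
    return {"name": name, "type": "gauge", "value": value,
            "timestamp": _now_ms(), "attributes": attributes}

def count(name, value, interval_ms, **attributes):
    # A total over a reporting interval, e.g. cache misses.
    return {"name": name, "type": "count", "value": value,
            "interval.ms": interval_ms, "timestamp": _now_ms(),
            "attributes": attributes}

def summary(name, value, interval_ms, **attributes):
    # value is a dict with count/sum/min/max over the interval.
    return {"name": name, "type": "summary", "value": value,
            "interval.ms": interval_ms, "timestamp": _now_ms(),
            "attributes": attributes}

temp = gauge("temperature", 15, city="Portland", state="Oregon")
misses = count("cache.misses", 15, 10000, host="dev.server.com")
```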

This is the sample JSON for creating a Metric connector:

{
  "name": "metrics-connector",
  "config": {
    "connector.class": "com.newrelic.telemetry.metrics.MetricsSinkConnector",
    "value.converter": "com.newrelic.telemetry.metrics.MetricsConverter",
    "value.converter.schemas.enable": false,
    "topics": "nrmetrics",
    "api.key": "<NEW_RELIC_API_KEY>"
  }
}

The above configuration will create a connector named metrics-connector, which will ingest metrics from the nrmetrics topic. As mentioned above, you will need to replace the <NEW_RELIC_API_KEY> with the API key that you got from New Relic. This sample message creates three different metrics of the type Count, Gauge, and Summary. Make sure you use the current timestamp.

{
  "metrics": [
    {
      "name": "cache.misses",
      "type": "count",
      "value": 15,
      "timestamp": <CURRENT_TIMESTAMP>,
      "interval.ms": 10000,
      "attributes": {
        "cache.name": "myCache",
        "host.name": "dev.server.com"
      }
    },
    {
      "name": "temperature",
      "type": "gauge",
      "value": 15,
      "timestamp": <CURRENT_TIMESTAMP>,
      "attributes": {
        "city": "Portland",
        "state": "Oregon"
      }
    },
    {
      "name": "service.response.duration",
      "type": "summary",
      "value": {
        "count": 5,
        "sum": 0.004382655,
        "min": 0.0005093,
        "max": 0.001708826
      },
      "interval.ms": 10000,
      "timestamp": <CURRENT_TIMESTAMP>,
      "attributes": {
        "host.name": "dev.server.com",
        "app.name": "foo"
      }
    }
  ]
}

After you add the message to the Kafka topic, head over to New Relic One’s Query Builder and run a query like SELECT * from Metric. You should see the three metrics above created.


The required fields in a Metric are explained in the documentation. Also note that you can control how many times the connector tasks retry in the case of network issues using two extra fields in your JSON config: nr.max.retries (maximum number of retries, default 5) and nr.retry.interval.ms (interval between retries in milliseconds, default 1,000).
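The retry semantics can be sketched like this. It is illustrative only, not the connector’s source; the demo sender below is a stand-in that fails twice before succeeding.

```python
import time

def send_with_retries(send_fn, max_retries=5, retry_interval_ms=1000):
    """Call send_fn, retrying up to max_retries times on network errors."""
    for attempt in range(max_retries + 1):
        try:
            return send_fn()
        except ConnectionError:
            if attempt == max_retries:
                raise  # retries exhausted; surface the error
            time.sleep(retry_interval_ms / 1000.0)

# Demo: a fake sender that fails twice, then succeeds on the third attempt.
attempts = []
def flaky_send():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("transient network error")
    return "ok"

result = send_with_retries(flaky_send, retry_interval_ms=0)
```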

Next steps

Interested in learning more about Kafka Connect? Download the Kafka Connect New Relic Connector to get started! You can also check out Confluent Cloud, a fully managed event streaming service based on Apache Kafka. Use the promo code CL60BLOG to get an additional $60 of free Confluent Cloud usage.

If you’re not already a New Relic user, there’s no better time to get started. You can send 100 GB per month to the Telemetry Data Platform and enjoy complete access to all of New Relic—free forever.
