The Wayback Machine - https://web.archive.org/web/20220420151225/https://github.com/topics/schema-registry

schema-registry

Here are 210 public repositories matching this topic...

mausch commented Apr 2, 2020

The title might seem a bit vague, but I don't know how to describe it any better, tbh :-)

Anyway, this is what happened: I got some 500 responses from the schema registry, and all I could see in the logs was:

[2020-04-02 16:03:35,048] INFO 100.96.14.58 - - [02/Apr/2020:16:03:34 +0000] "PUT /config/some-topic-value HTTP/1.1" 500 69  502 (io.confluent.rest-utils.requests)

The logs di
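One way to get more detail than the registry's own log line is to read the error body of the failed response. The sketch below builds the same PUT /config/{subject} call from the log above and surfaces the body on failure; the helper names are hypothetical, and the content type follows Confluent Schema Registry's REST conventions:

```python
import json
from urllib import request, error

# Hypothetical helpers for Confluent Schema Registry's
# PUT /config/{subject} endpoint, as seen in the log line above.

def build_config_request(base_url, subject, compatibility):
    """Build a PUT request that sets a subject's compatibility level."""
    url = f"{base_url}/config/{subject}"
    body = json.dumps({"compatibility": compatibility}).encode("utf-8")
    req = request.Request(url, data=body, method="PUT")
    req.add_header("Content-Type", "application/vnd.schemaregistry.v1+json")
    return req

def put_config(base_url, subject, compatibility):
    """Send the request; on an HTTP error, surface the response body,
    which often carries more detail than the server-side log line."""
    req = build_config_request(base_url, subject, compatibility)
    try:
        with request.urlopen(req) as resp:
            return json.loads(resp.read())
    except error.HTTPError as e:
        # Even a bare "500" in the logs usually has a JSON error body here.
        raise RuntimeError(f"{e.code}: {e.read().decode('utf-8', 'replace')}") from e
```
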

harikrishnakanchi commented Oct 28, 2021

In the golang client, consumers get a dynamic message instance after parsing. Add an example to the docs showing how to use a dynamic message instance to get values of different types in consumer code.

List of protobuf types to cover

  • timestamp
  • duration
  • bytes
  • message type
  • struct
  • map
good first issue docs
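This is not the Go example the issue asks for, but as a language-agnostic sketch of two of the trickier well-known types on that list: a Timestamp or Duration decoded from a dynamic message is ultimately a (seconds, nanos) pair, which converts mechanically to native time types. Helper names here are hypothetical:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical helpers: a dynamically parsed protobuf message exposes
# google.protobuf.Timestamp and Duration values as (seconds, nanos) pairs.

def timestamp_to_datetime(seconds, nanos):
    """Convert a protobuf Timestamp's fields to an aware UTC datetime."""
    return datetime.fromtimestamp(seconds, tz=timezone.utc) + timedelta(
        microseconds=nanos // 1000
    )

def duration_to_timedelta(seconds, nanos):
    """Convert a protobuf Duration's fields to a timedelta."""
    return timedelta(seconds=seconds, microseconds=nanos // 1000)
```
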
c-lair-ka commented Mar 22, 2021

Hi, I am trying to register a new schema following the example here: https://kafkajs.github.io/confluent-schema-registry/docs/usage-with-kafkajs
I keep getting undefined for SchemaType:
const { SchemaType } = require('@kafkajs/confluent-schema-registry') leaves SchemaType undefined.
I double checked that I have the latest version @kafkajs/confluent-schema-registry@2.0.0
What am I doing wr

bug good first issue
skovalyova commented Aug 10, 2021

Let's assume we have the following Avro schema, example-class.avsc:

{
  "name": "Name",
  "namespace": "Avro.Namespace",
  "type": "record",
  "doc": "...",
  "fields": [
    {
      "doc": "Description",
      "name": "some_field",
      "type": "string"
    }
  ]
}

After calling the dotnet avro generate < example-class.avsc > ExampleClass.cs command, the namespace in t

enhancement good first issue

The main goal is to play with Kafka Connect and Streams. We have: store-api, which inserts/updates records in MySQL; source connectors, which monitor inserted/updated records in MySQL and push messages related to those changes to Kafka; sink connectors, which read messages from Kafka and insert documents into ES; and store-streams, which listens for messages in Kafka, processes them with Kafka Streams, and pushes new messages back to Kafka.

  • Updated Apr 4, 2022
  • Java
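The flow that description walks through can be sketched as a toy pipeline, with in-memory stand-ins (all names hypothetical) for MySQL, the Kafka topic, and Elasticsearch:

```python
# In-memory stand-ins (hypothetical) for the stores in the pipeline:
# store-api -> MySQL -> source connector -> Kafka -> sink connector -> ES.
mysql, kafka_topic, elasticsearch = {}, [], {}

def store_api_upsert(record_id, record):
    """store-api: insert/update a record in 'MySQL'."""
    mysql[record_id] = record

def source_connector_poll():
    """Source connector: push changed rows to the 'Kafka' topic."""
    for record_id, record in mysql.items():
        kafka_topic.append({"key": record_id, "value": record})

def sink_connector_drain():
    """Sink connector: index each Kafka message as an 'ES' document."""
    while kafka_topic:
        msg = kafka_topic.pop(0)
        elasticsearch[msg["key"]] = msg["value"]

store_api_upsert("1", {"name": "book"})
source_connector_poll()
sink_connector_drain()
```

After the three calls, the record written through store-api has propagated end to end into the Elasticsearch stand-in.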

The goal of this project is to play with Kafka, Debezium, and ksqlDB. For this, we have: research-service, which inserts/updates/deletes records in MySQL; source connectors, which monitor changes to records in MySQL and push messages related to those changes to Kafka; sink connectors and kafka-research-consumer, which listen to messages from Kafka and insert/update documents in Elasticsearch; and finally ksqlDB-Server, which listens to some topics in Kafka, does some joins, and pushes new messages to new topics in Kafka.

  • Updated Apr 4, 2022
  • Java
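The join step in that setup can be sketched as a ksqlDB-style stream-table join in miniature: each event from one topic is enriched with the matching row of a lookup table. All field and function names below are hypothetical:

```python
# A ksqlDB-style stream-table join in miniature (hypothetical names):
# enrich each event from one topic with the matching row of a table.

def stream_table_join(events, table, key_field):
    """Yield each event merged with its matching table row, if any."""
    for event in events:
        row = table.get(event[key_field])
        if row is not None:
            yield {**event, **row}

events = [{"researcher_id": "r1", "title": "CDC"}]
table = {"r1": {"name": "Alice"}}
joined = list(stream_table_join(events, table, "researcher_id"))
```

Events with no matching table row are dropped, mirroring an inner join; yielding the bare event instead would give left-join semantics.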
