The Wayback Machine - https://web.archive.org/web/20220709132843/https://github.com/topics/dataops

DataOps

DataOps is an automated, process-oriented methodology used by analytics and data teams to improve the quality and reduce the cycle time of data analytics. While DataOps began as a set of best practices, it has matured into a new and independent approach to data analytics. DataOps applies to the entire data lifecycle, from data preparation to reporting, and recognizes the interconnected nature of the data analytics team and information technology operations.

Here are 113 public repositories matching this topic...

flyte
rubrix
frascuchon commented Oct 18, 2021

The default RubrixLogHTTPMiddleware record mapper for token classification expects a structure including a text field for inputs. This can make preparing model inputs a bit cumbersome. The default mapper could also accept flat strings as inputs:

def token_classification_mapper(inputs, outputs):
    i
good first issue help wanted
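The change proposed above could look something like this minimal sketch. The function name and the `"text"` key follow the snippet in the issue; the flat-string branch and the return shape are assumptions for illustration, not Rubrix's actual API:

```python
def token_classification_mapper(inputs, outputs):
    # Hypothetical sketch: accept either the structured form
    # ({"text": ...}) or a plain string, as the issue proposes.
    if isinstance(inputs, str):
        text = inputs
    elif isinstance(inputs, dict):
        text = inputs.get("text", "")
    else:
        raise TypeError(f"unsupported inputs type: {type(inputs)!r}")
    # Return shape is illustrative only.
    return {"text": text, "predictions": outputs}
```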

Streaming reference architecture for ETL with Kafka and Kafka Connect. You can find more at http://lenses.io on how we provide a unified solution to manage your connectors, the most advanced SQL engine for Kafka and Kafka Streams, cluster monitoring and alerting, and more.

  • Updated Jun 20, 2022
  • Scala
StewartJingga commented Jul 6, 2022

Is your feature request related to a problem? Please describe.
I was trying to store extra group information with below details

[PUT] /v1beta1/groups/{id}
{
  "name": "My Group",
  "slug": "my-group",
  "orgId": "{ORG_ID}",
  "metadata": {
    "description": "my-group-description",
    "is_active": true
  }
}

But shield responded with

Status: 400 Bad Request
{
   
good first issue
tozka commented Apr 11, 2022

What is the feature request? What problem does it solve?

As employees leave the organization/company or users change email addresses, the notification list configured for a job eventually accumulates many invalid addresses. This causes issues with the SMTP relay (e.g. Postfix), which could buffer all the invalid requests until the queue is full, causing all mail for all jobs to be blocked.

enhancement good first issue
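The cleanup this request describes could be sketched roughly as follows. The helper name, the syntax-only check, and the bounce list are all assumptions for illustration; a real fix would likely rely on SMTP bounce feedback rather than syntax alone:

```python
import re

# Loose syntax check only; it does not verify deliverability.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def prune_notification_list(addresses, bounced):
    """Drop syntactically invalid addresses and known-bounced ones
    from a job's notification list (hypothetical helper)."""
    bounced = set(bounced)
    return [
        addr for addr in addresses
        if EMAIL_RE.match(addr) and addr not in bounced
    ]
```

Running such a pruning pass periodically would keep the relay queue from filling with undeliverable mail.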
harikrishnakanchi commented Oct 28, 2021

In the Golang client, consumers get a dynamic message instance after parsing. Add an example in the docs showing how to use a dynamic message instance to read values of different types in consumer code.

List of protobuf types to cover

  • timestamp
  • duration
  • bytes
  • message type
  • struct
  • map
good first issue docs
meteor
ravisuhag commented Jun 28, 2021

Deliverables

  • add unit tests
  • add extractor
  • add README.md in plugins/extractors/neo4j, defining output
  • register your extractor in plugins/extractors/populate.go
  • add the extractor to the extractor list in docs/reference/extractor.md

Output must contain a Table

Explore the Table Data Model and add as many features as possible.

Table

| Fi

good first issue extractor
Wikipedia

Related Topics

open-data