- San Francisco
- https://linkedin.com/in/tuliren
Pinned
- airbytehq/airbyte (Public): Airbyte is an open-source EL(T) platform that helps you replicate your data in your warehouses, lakes, and databases.
- LiveRamp/jack (Public): A set of scripts for generating fully functional Java database models from Ruby's ActiveRecord models and migrations.
1,070 contributions in the last year
Activity overview
Contributed to airbytehq/airbyte, tuliren/economy-dynamics-errata, airbytehq/json-avro-converter, and 5 other repositories.
Contribution activity
December 2021
Created 15 commits in 1 repository
Created 1 repository
- tuliren/nd064_course_1 (Python)
Created a pull request in airbytehq/airbyte that received 12 comments
- 🎉 Source Postgres: support all Postgres 14 types
  What: Support all Postgres types, except two remaining issues that we need to confirm on the destination side if the solution works: #8903 #8902 …
  (+575 −370, 12 comments)
Opened 15 other pull requests in 2 repositories
airbytehq/airbyte: 1 open, 9 merged, 2 closed
- Remove json avro schema converter hack
- 📑 Update source database performance test docs
- 🐞 Destination S3 & GCS: remove excessive logging
- 🐞 Destination e2e test: fix documentation url
- 📝 Update json to avro conversion doc
- 🎉 Testing destination: multiple logging modes
- Bump the minor version for mysql source
- Add more tests for json avro logical type conversion
- Update specs and fix build
- Bump s3 version to remove excessive logging
- Fix avro logical type conversion test
- Increase max waiting time for fb async job
airbytehq/json-avro-converter: 3 merged
Reviewed 26 pull requests in 2 repositories
airbytehq/airbyte: 25 pull requests
- Remove json avro schema converter hack
- Airbyte 8278 all in one static code checker to be run locally as well as during ci pipelines
- 🎉 Source-postgres\mssql\mysql added a HEAP dump capturing on outOfMemory Error (if any)
- improve db check looping
- 🐛 Destination S3: avro and parquet formats have issues with JsonToAvroSchemaConverter
- 🎉 Source Postgres: support all Postgres 14 types
- bump version for affected sources from #8749
- BigQuery/BiqQuery denorm Destinations : Add possibility to use different types of GCS files
- allow serializing to yaml without quoting strings
- implement new database config persistence
- 🎉 Setting CPU and Memory limits for performance tests via github
- 🎉 Destination Redshift (copy): accept bucket path for staging data
- 🐛 Jdbc sources: switch from "string" to "array" schema type for columns with JDBCType.ARRAY
- 🎉 Source Mailchimp: updated Mailchimp schemas
- 🎉 Source MySQL: support all MySQL 8.0 types
- Source MySQL\MsSql\Postgres: added RDS base performance tests
- Added benchmarks scripts with small instruction
- 🐛 Source Hubspot: additionalProperties: true
- move S3Config into destination-s3; update dependencies accordingly
- Redirect dbt log files to airbyte log when failing
- Destination Snowflake: Return Standard Loading
- 🎉 Source Iterable: Add email validation to list_users stream method execution
- 🎉 Snowflake Destination internal staging support
- Bump GCS version with avro/parquet timestamp conversion
- 🐛 Source-MySql: do not check cdc required param binlog_row_image for standard replication
airbytehq/json-avro-converter: 1 pull request
Created an issue in airbytehq/airbyte that received 3 comments
Opened 20 other issues in 1 repository
airbytehq/airbyte: 14 open, 6 closed
- Run the source database benchmarks with various CPU and memory limits
- Postgres source should return timestamp with millisecond precision
- Postgres source date includes time, and cannot handle BC dates
- Postgres source returns null for Infinity, -Infinity or NaN
- MySQL source cannot connect (0.5.x) or sync (0.4.13) data
- Prevent slow or failing discover calls for databases
- Reduce database data type logs
- Create a dev null destination on cloud for testing
- Convert ambiguous MySQL types to more specific ones
- Re-evaluate MySQL year type
- Create benchmarks using existing sample databases
- Destination deletion does not work on cloud
- Make sure the database benchmark test mimics the production
- Do not require retesting a connector when only the connector name is changed
- Document how to use SQL scripts to create benchmark databases
- Enable profiling for database benchmarks
- Control CPU and memory in benchmark database syncs
- Debug OOME issue in database benchmark
- Connector setup page does not show full field when the title is long
- Improve database usability by providing better feedback of invalid inputs
16 contributions in private repositories (Dec 6 – Dec 27)

