
mxnet

Here are 592 public repositories matching this topic...

DNXie commented Aug 24, 2020

Description

This is a documentation bug. The parameters of the API mxnet.test_utils.check_numeric_gradient are not consistent between the signature and the Parameters section: the Parameters section documents a check_eps parameter, but it does not appear in the signature.

Link to document: https://mxnet.apache.org/versions/1.6/api/python/docs/api/mxnet/test_utils/index.html#mxnet.test_utils.check_numeric_gra
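For reference, a minimal sketch of a typical call (assuming MXNet 1.6): the step-size keyword the signature actually accepts is numeric_eps, while check_eps only appears in the documented Parameters section.

import numpy as np
import mxnet as mx
from mxnet.test_utils import check_numeric_gradient

# Numerically check the gradient of a simple symbol; the step size is passed
# as numeric_eps (the name in the signature), not check_eps (the name in the docs).
data = mx.sym.Variable('data')
sym = mx.sym.square(data)
check_numeric_gradient(sym, location=[np.random.normal(size=(2, 3))], numeric_eps=1e-3)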

chan4cc commented Apr 26, 2021

New Operator

Describe the operator

Why is this operator necessary? What does it accomplish?

This is a frequently used operator in TensorFlow/Keras.

Can this operator be constructed using existing onnx operators?

If so, why not add it as a function?

I don't know.

Is this operator used by any model currently? Which one?

Are you willing to contribute it?

gluon-cv
JiaMingLin commented Aug 3, 2021

Hi,
I need to download the Something-Something and Jester datasets, but the 20bn website "https://20bn.com" has been unavailable for weeks; the error message is "503 Service Temporarily Unavailable".

I have already downloaded the video data for Something-Something v2 and still need the label data. For Jester, I need both the video and label data. Could someone share the

willsmithorg commented Dec 26, 2021

Could FeatureTools be implemented as an automated preprocessor for AutoGluon, adding the ability to handle multi-entity problems (i.e. data split across multiple normalised database tables)? If you supplied AutoGluon with a list of DataFrames instead of a single DataFrame, it would first invoke FeatureTools (see the sketch after this list):

  • take the multiple DataFrames (entities) and try to auto-infer the relationship between
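A rough sketch of what that handoff could look like, assuming the featuretools 1.x EntitySet/dfs API and AutoGluon's TabularPredictor; the table names, columns, and the "churned" label below are purely illustrative:

import numpy as np
import pandas as pd
import featuretools as ft
from autogluon.tabular import TabularPredictor

# Two illustrative normalised tables: one row per customer, many rows per order.
rng = np.random.default_rng(0)
customers = pd.DataFrame({
    "customer_id": range(40),
    "segment": rng.choice(["a", "b"], size=40),
    "churned": rng.integers(0, 2, size=40),
})
orders = pd.DataFrame({
    "order_id": range(120),
    "customer_id": rng.integers(0, 40, size=120),
    "amount": rng.uniform(1, 100, size=120),
})

# Step 1: FeatureTools flattens the multi-entity data into one feature matrix.
es = ft.EntitySet(id="shop")
es = es.add_dataframe(dataframe_name="customers", dataframe=customers, index="customer_id")
es = es.add_dataframe(dataframe_name="orders", dataframe=orders, index="order_id")
es = es.add_relationship("customers", "customer_id", "orders", "customer_id")
feature_matrix, _ = ft.dfs(entityset=es, target_dataframe_name="customers")

# Step 2: AutoGluon then sees a single flat table, exactly as it does today.
predictor = TabularPredictor(label="churned").fit(feature_matrix.reset_index())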
gluon-ts
gluon-nlp
preeyank5 commented Dec 3, 2020

Description

While using tokenizers.create with the model and vocab files for a custom corpus, the code throws an error and is unable to generate the BERT vocab file.

Error Message

ValueError: Mismatch vocabulary! All special tokens specified must be control tokens in the sentencepiece vocabulary.

To Reproduce

from gluonnlp.data import tokenizers
tokenizers.create('spm', model_p
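A hypothetical completion of the truncated reproduction above; the keyword names and file names are guesses, not taken from the original report:

from gluonnlp.data import tokenizers

# Hypothetical file names and keyword names (model_path / vocab) completing the
# truncated call above. The ValueError is raised when the special tokens listed
# in the vocab are not control tokens inside the sentencepiece model.
tokenizer = tokenizers.create('spm',
                              model_path='custom_corpus.model',
                              vocab='custom_corpus.vocab')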
