Questions tagged [learning-theory]

Questions about the design and analysis of machine learning algorithms.

0 votes
0 answers
34 views

Exact learning model definition

The following is Definition 17 of Kearns and Vazirani's "An Introduction to Computational Learning Theory". We say that the representation class $C$ is efficiently exactly learnable from ...
user1868607
  • 2,252
0 votes
0 answers
127 views

Open Problem: Structural Learnability of Pseudo-Random Boolean Circuits

I would like to propose an open problem at the intersection of computational complexity, pseudorandomness, and circuit theory. This problem has potential implications for cryptography, AI model ...
Konstant
1 vote
0 answers
42 views

How Does the ELM Method Approximate Solutions to Linear ODEs Without Direct Training Data?

How Does the ELM Method Approximate Solutions to Linear ODEs Without Direct Training Data? Summary of the Problem I am working on solving a linear inhomogeneous ordinary differential equation (...
RIZWAN GULZAR MIR
1 vote
0 answers
33 views

Conditions on LR in Gradient Descent

In Introductory Lectures on Convex Optimization by Yurii Nesterov, Section 1.2.3 shows that gradient descent is guaranteed to converge if the step size is either fixed or ...
Kyle
  • 11
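The fixed-step condition in the excerpt can be illustrated concretely. A minimal sketch (my own example, not from Nesterov's book): gradient descent on a quadratic with the classic fixed step size $1/L$, where $L$ is the Lipschitz constant of the gradient.

```python
import numpy as np

# Hypothetical quadratic f(x) = x^T A x / 2; grad f(x) = A x.
A = np.array([[3.0, 0.0], [0.0, 1.0]])
L = np.linalg.eigvalsh(A).max()          # Lipschitz constant of grad f

x = np.array([4.0, -2.0])
for _ in range(200):
    x = x - (1.0 / L) * (A @ x)          # fixed step size 1/L guarantees convergence

print(np.linalg.norm(x) < 1e-6)          # iterates approach the minimizer 0
```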
0 votes
0 answers
112 views

Why can't we say that P=NP if we have an infinite text file with solution for every possible SAT combination?

I believe that I have a misunderstanding of the P=NP problem, which arose while I was thinking about how it could be proved in a non-constructive manner. We know that we can build an infinitely large text file with ...
TokieSan
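The counting obstacle behind the lookup-table idea can be made concrete. A hedged sketch (the clause encoding and the `brute_force_sat` helper are my own): deciding even a single formula over $n$ variables by enumeration already touches $2^n$ assignments, and a table covering all formulas is infinite, so it cannot serve as the finite, uniform polynomial-time algorithm that P=NP requires.

```python
from itertools import product

def brute_force_sat(clauses, n):
    # clauses: list of clauses, each a list of signed 1-based literals,
    # e.g. [[1, -2], [2]] means (x1 OR NOT x2) AND x2.
    for assignment in product([False, True], repeat=n):   # 2^n rows
        if all(any(assignment[abs(l) - 1] == (l > 0) for l in c)
               for c in clauses):
            return True
    return False

print(brute_force_sat([[1, -2], [2]], n=2))   # satisfiable: x1=x2=True
print(brute_force_sat([[1], [-1]], n=1))      # unsatisfiable
```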
1 vote
1 answer
117 views

Why we need at most $2n$ examples to determine an axis aligned rectangle

In Shalev-Shwartz and Ben-David's Understanding Machine Learning, the authors write: Let $\mathcal{H}_n$ be the class of axis-aligned rectangles in $\mathbb{R}^n$, namely, $$ \mathcal{H}_n = \{h(a_1,\dots,...
Tran Khanh
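The $2n$ bound in the excerpt reflects the fact that the tightest consistent axis-aligned rectangle is determined coordinate-wise by the min and max of the positive examples: each of the $2n$ boundary values is witnessed by at most one example. A minimal sketch (the `tightest_rectangle` helper is hypothetical, not from the book):

```python
import numpy as np

def tightest_rectangle(positives):
    # Smallest axis-aligned rectangle containing all positive examples:
    # coordinate-wise min gives the lower corner, max the upper corner.
    P = np.asarray(positives)
    return P.min(axis=0), P.max(axis=0)

lo, hi = tightest_rectangle([[0.0, 1.0], [2.0, 0.5], [1.0, 3.0]])
print(lo.tolist(), hi.tolist())  # [0.0, 0.5] [2.0, 3.0]
```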
2 votes
0 answers
46 views

Representational power of Neural Networks without a bias term

In a fully connected Neural Network, each perceptron has its bias term $b$, which is learnt. Often (e.g., in Linear/Logistic Regression), when we don't want to treat this bias term explicitly, we ...
Harry
  • 21
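The standard trick the excerpt alludes to is absorbing the bias into the weight vector by appending a constant-1 feature, so that $w \cdot x + b = w' \cdot x'$. A small sketch with illustrative numbers (not from the question):

```python
import numpy as np

x = np.array([2.0, -1.0])
w = np.array([0.5, 1.5])
b = 0.7

x_aug = np.append(x, 1.0)   # x' = (x, 1): constant feature carries the bias
w_aug = np.append(w, b)     # w' = (w, b): bias becomes an ordinary weight

print(np.isclose(w @ x + b, w_aug @ x_aug))  # True
```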
2 votes
0 answers
194 views

Transductive Learning vs Inductive Learning in Machine Learning

Several recent research works have shown that transductive learning/inference outperforms inductive learning/inference on classification problems. This has been found in few-shot learning, other ...
Sandra
  • 73
0 votes
1 answer
118 views

Question about the proof for the sample complexity of axis-aligned rectangles

The classical proof for the sample complexity of the hypothesis class of axis-aligned rectangles usually begins by stating that our $A(S) \subset R^*$, where $R^*$ is the target function. My only ...
Dragoș Constantin
1 vote
0 answers
443 views

Infinite VC Dim not PAC learnable

This is usually shown by an application of the Statistical No Free Lunch Theorem. But is it possible to show this by working simply with the definition of PAC learnability and the sample complexity ...
JustBlaze
1 vote
0 answers
67 views

Regarding constant * opt approximation in agnostic learning

In standard agnostic learning, we assume that there is a concept class $H\subseteq \{h:\{0,1\}^n\rightarrow \{0,1\}\}$. Given samples from a distribution $D:\{0,1\}^n\times \{0,1\}\rightarrow [0,1]$, ...
postasguest
2 votes
0 answers
85 views

Multi-class sample complexity for PAC learning using "VC dimension"

VC dimension covers the binary classification case, i.e. when we want to get a predictor $X \to \{0, 1\}$. Using VC dimension, we can get the upper bound on the sample complexity for PAC-learning. In ...
Dmitry
  • 346
1 vote
0 answers
70 views

Precise definition of Universal Learner in Machine Learning

It is surprising to me that I cannot find a precise definition of a universal learner on the internet. I can guess what it should be, but I don't want to make a mistake, so I have come here. Here's ...
Suraj
  • 11
2 votes
1 answer
118 views

Generalization error bound in case of collaborative learning

I am reading the paper "Collaborative PAC Learning" by Blum et al., so I will set up the problem here to avoid reading the complete section (personalized setting). Assume there are $...
Naren
  • 43
2 votes
1 answer
562 views

Understanding halving algorithm in online learning

I am working through "Understanding Machine Learning" by Shai Shalev-Shwartz. In the chapter "Online Learning" I came across the halving algorithm; the author uses the ...
Naren
  • 43
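For context, the halving algorithm predicts with the majority vote of the current version space and discards every inconsistent hypothesis after each round; since a mistake removes at least half of the hypotheses, it makes at most $\log_2 |\mathcal{H}|$ mistakes in the realizable case. A minimal sketch (the threshold class and the example stream are hypothetical, not the book's example):

```python
import math

def halving(H, stream):
    # H: finite list of hypotheses (callables x -> {0, 1}).
    # stream: iterable of (x, y) pairs with y the true label.
    version_space = list(H)
    mistakes = 0
    for x, y in stream:
        votes = [h(x) for h in version_space]
        pred = 1 if 2 * sum(votes) >= len(votes) else 0  # majority vote
        if pred != y:
            mistakes += 1
        # keep only hypotheses consistent with the revealed label
        version_space = [h for h in version_space if h(x) == y]
    return mistakes

# Hypothetical class: threshold functions on integers; target is (x >= 2).
H = [lambda x, t=t: int(x >= t) for t in range(4)]
stream = [(0, 0), (3, 1), (1, 0), (2, 1)]
print(halving(H, stream) <= math.ceil(math.log2(len(H))))  # mistake bound holds
```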