Challenge

Problem Statement - Implement a Queue using two Stacks

I implemented `dequeue()` in \$O(1)\$ at the cost of `enqueue()` in \$O(n)\$.

#include <cmath>
#include <cstdio>
#include <vector>
#include <iostream>
#include <algorithm>
#include <stack>
using namespace std;

struct Queue{
    stack<int> s1,s2;

    // The front of the queue is kept on top of s1, so dequeue is O(1).
    int dequeue(){
        int x = s1.top();
        s1.pop();
        return x;
    }

    void peek(){
        cout<< s1.top()<<"\n";
    }

    // O(n): empty s1 into s2, push the new element, then move everything
    // back so the oldest element sits on top of s1 again.
    void enqueue(int data){
        while(!s1.empty()){
            s2.push(s1.top());
            s1.pop();
        }

        s1.push(data);

        while(!s2.empty()){
            s1.push(s2.top());
            s2.pop();
        }
    }
};


int main() {
    int t; cin>>t;
    Queue q;

    while(t--){
        int k; cin>>k;
        switch(k){
            case 1:
                int x; cin>>x;
                q.enqueue(x);
                break;
            case 2:
                q.dequeue();
                break;
            case 3:
                q.peek();
        }
    }
    return 0;
}

There are 15 test cases. This code runs properly for the first 5 test cases, then I get a 'Time Limit Exceeded' message for the rest of the test cases.

This suggests the time complexity of my program is too high.

So I tried reversing it: I made `enqueue()` \$O(1)\$ and `dequeue()` \$O(n)\$, but even that did not change anything.
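For clarity, the reversed variant I tried looked roughly like this (a minimal sketch rather than my exact submission; the name `ReversedQueue` is just for illustration, and only `enqueue()` and `dequeue()` are shown):

#include <stack>

// Sketch only: enqueue() is O(1); dequeue() shuttles every element across
// the helper stack and back, so it is O(n) per call.
struct ReversedQueue{
    std::stack<int> s1, s2;    // s1 holds the elements, newest on top

    void enqueue(int data){
        s1.push(data);         // O(1): just push onto s1
    }

    int dequeue(){
        // Move everything to s2 so the oldest element ends up on top.
        while(!s1.empty()){
            s2.push(s1.top());
            s1.pop();
        }
        int x = s2.top();      // oldest element
        s2.pop();
        // Move the rest back so s1 keeps its newest-on-top order.
        while(!s2.empty()){
            s1.push(s2.top());
            s2.pop();
        }
        return x;
    }
};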

What should I do to reduce the time complexity of this code?
