> So, why did Ruby's creators have to use the concept of symbols in the language?
Well, they didn't strictly "have to"; they chose to. Also, note that, strictly speaking, Symbols are not part of the language: they are part of the core library. They do have language-level literal syntax, but they would work just as well if you had to construct them by calling Symbol::new.
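For reference, these are the literal forms the language provides, next to constructing one from a String at runtime:

```ruby
:foo           # the basic Symbol literal
:"foo bar"     # quoted literal, for names that aren't valid identifiers
%s{foo}        # percent-literal form
'foo'.to_sym   # constructing a Symbol from a String at runtime
```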
> I ask from the perspective of a non-Ruby programmer trying to understand it. I've learned lots of other languages, and in none of them have I found the need to specify whether or not I was dealing with what Ruby calls symbols.
You didn't say what those "lots of other languages" are, but here's just a small selection of languages that have a Symbol datatype like Ruby's:
- ECMAScript, also JavaScript
- Scala
- Scheme, Common Lisp, Clojure (which additionally has keywords); basically every language in the Lisp family, along with its successors and cousins, has them
- Smalltalk, Newspeak, and many other languages in the Smalltalk family (which is likely where Ruby got them from), including Objective-C (albeit in limited form)
- Erlang (called atoms), also Elixir and LFE
- Julia
- Prolog (called atoms), which is likely where Erlang got them from
There are also other languages which provide the features of Symbols in a different form. In Java, for example, the features of Ruby's Strings are split across two (actually three) types: String and StringBuilder/StringBuffer. On the other hand, the features of Ruby's Symbol type are folded into the Java String type: Java Strings can be interned; literal strings and Strings which are the result of compile-time evaluated constant expressions are automatically interned; dynamically generated Strings can be interned by calling the String.intern method. An interned String in Java is exactly like a Symbol in Ruby, but it's not implemented as a separate type; it's just a different state that a Java String can be in. (Note: in earlier versions of Ruby, String#to_sym used to be called String#intern, and that method still exists today as a legacy alias.)
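The legacy alias is still observable in current Ruby; a quick sketch:

```ruby
'hello'.intern  # => :hello — intern survives as a legacy alias of String#to_sym
'hello'.to_sym  # => :hello — the modern name for the same operation
```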
> The main question could be: does the concept of symbols in Ruby exist as a performance improvement, over the language itself and over other languages,
Symbols are first and foremost a datatype with specific semantics. These semantics also make it possible to implement some performant operations (e.g. fast O(1) equality testing), but that's not the main purpose.
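A minimal sketch of the difference (all names and sizes here are made up for illustration): String equality has to compare contents, while Symbol equality is an identity check.

```ruby
require 'benchmark'

str_a = 'x' * 1_000_000  # two equal but distinct million-character Strings
str_b = 'x' * 1_000_000
sym_a = :some_name       # two equal Symbols are the very same object
sym_b = :some_name

Benchmark.bm(7) do |bm|
  bm.report('String') { 1_000.times { str_a == str_b } }  # O(n) content comparison
  bm.report('Symbol') { 1_000.times { sym_a == sym_b } }  # O(1) identity check
end
```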
> or is it just something that needs to exist because of the way the language is written?
Symbols are not needed in the Ruby language at all; Ruby would work just fine without them. They are purely a library feature. There is exactly one place in the language that is tied to Symbols: a def method definition expression evaluates to a Symbol denoting the name of the method being defined. However, that is a rather recent change; before that, the return value was simply left unspecified. MRI simply evaluated to nil, Rubinius evaluated to a Rubinius::CompiledMethod object, and so on. It would also be possible to evaluate to an UnboundMethod … or just a String.
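That return value is easy to observe in a recent Ruby; a sketch (the method name greet is made up):

```ruby
result = def greet
  'hello'
end

result        # => :greet — the def expression itself evaluates to a Symbol
result.class  # => Symbol
```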
> Would a program in Ruby be lighter and/or faster than its, let's say, Python or Node counterpart? If so, would it be because of symbols?
I'm not sure what you are asking here. Performance is mostly a matter of implementation quality, not of the language. Plus, Node isn't even a language; it's an evented I/O framework for ECMAScript. Running an equivalent script on IronPython and MRI, IronPython is likely to be faster. Running an equivalent script on CPython and JRuby+Truffle, JRuby+Truffle is likely to be faster. This has nothing to do with Symbols and everything to do with the quality of the implementation: JRuby+Truffle has an aggressively optimizing compiler plus the whole optimization machinery of a high-performance JVM, whereas CPython is a simple interpreter.
> Since one of Ruby's intents is to be easy for humans to read and write, couldn't its creators have eased the process of coding by implementing those improvements in the interpreter itself (as it might be in other languages)?
No. Symbols are not a compiler optimization. They are a separate datatype with specific semantics. They are not like YARV's flonums, which are a private internal optimization for Floats. Nor is the situation the same as with Integer, Bignum, and Fixnum, where what should be an invisible private internal optimization detail unfortunately isn't. (This is finally going to be fixed in Ruby 2.4, which removes Fixnum and Bignum and leaves just Integer.)
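For comparison, here is how that detail leaks (a sketch; the exact output depends on the Ruby version):

```ruby
# Before Ruby 2.4, the machine-word optimization leaks into the class hierarchy:
1.class          # => Fixnum
(2 ** 100).class # => Bignum

# From Ruby 2.4 on, both are simply Integer:
1.class          # => Integer
(2 ** 100).class # => Integer
```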
Doing it the way Java does it, as a special state of normal Strings means that you always need to be wary about whether or not your Strings are in that special state and under which circumstances they are automatically in that special state and when not. That's a much higher burden than simply having a separate datatype.
> Would there be a language-agnostic definition of Symbols and a reason to have them in other languages?
Symbol is a datatype that denotes the concept of a name or label. Symbols are value objects: immutable, usually immediate (if the language distinguishes such a thing), stateless, and without identity. Two Symbols which are equal are also guaranteed to be identical; in other words, two Symbols which are equal are actually the same one Symbol. This means that value equality and reference equality are the same thing, and thus equality is efficient and O(1).
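A sketch of that guarantee in Ruby (the name label is arbitrary):

```ruby
a = 'label'.to_sym
b = ('lab' + 'el').to_sym   # built from a different, dynamically constructed String

a == b                      # => true — value equality
a.equal?(b)                 # => true — reference equality: the very same object
a.object_id == b.object_id  # => true
```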
The reasons to have them in a language are really the same, independent of the language. Some languages rely more on them than others.
In the Lisp family, for example, there is no concept of a "variable". Instead, you have Symbols associated with values.
In languages with reflective or introspective capabilities, Symbols are often used to denote the names of reflected entities in the reflection APIs, e.g. in Ruby, Object#methods, Object#singleton_methods, Object#public_methods, Object#protected_methods, and Object#private_methods return an Array of Symbols (although they could just as well return an Array of Methods). Object#public_send takes a Symbol denoting the name of the message to send as an argument (although it accepts a String as well, a Symbol is more semantically correct).
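A sketch of both directions, with a made-up Greeter class:

```ruby
class Greeter
  def greet(name)
    "Hello, #{name}!"
  end
end

g = Greeter.new
Greeter.public_instance_methods(false) # => [:greet] — an Array of Symbols
g.public_send(:greet, 'Ruby')          # => "Hello, Ruby!"
g.public_send('greet', 'Ruby')         # a String works too, but the Symbol is more idiomatic
```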
In ECMAScript, Symbols are a fundamental building block of making ECMAScript capability-safe in the future. They also play a big role in reflection.