Timeline for Java takes 2 bytes to represent character?
Current License: CC BY-SA 4.0
8 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Jul 9, 2018 at 3:09 | vote accept | user3198603 | | |
| Jul 6, 2018 at 18:18 | review Close votes | | | Jul 11, 2018 at 3:03 |
| Jul 6, 2018 at 15:37 | history edited | Deduplicator | CC BY-SA 4.0 | copy-edited |
| Jul 6, 2018 at 15:22 | comment added | Berin Loritsch | | @user3198603, that assumption has not been true for nearly 3 decades now. Don't forget that Asian nations like Japan and China have well over 256 characters. Most modern databases and programming languages have been using some form of UTF for a long time now. |
| Jul 6, 2018 at 14:40 | answer added | Simon B | | timeline score: 7 |
| Jul 6, 2018 at 14:38 | comment added | user3198603 | | You mean unicode characters ? |
| Jul 6, 2018 at 14:37 | comment added | Zavior | | "1 byte can represent 2^8 = 256 Characters" Well what if you have more than 256 characters? |
| Jul 6, 2018 at 14:31 | history asked | user3198603 | CC BY-SA 4.0 | |
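For reference, here is a minimal Java sketch (not part of the original thread) illustrating the point raised in the comments above: a Java `char` is a 16-bit UTF-16 code unit, so it covers far more than 256 characters, while code points beyond U+FFFF need two `char`s, and the on-disk size depends on the encoding rather than on `char` alone.

```java
import java.nio.charset.StandardCharsets;

public class CharSizeDemo {
    public static void main(String[] args) {
        // A Java char is a 16-bit UTF-16 code unit, i.e. 2 bytes.
        System.out.println(Character.BYTES); // 2
        System.out.println(Character.SIZE);  // 16 (bits)

        // A character well outside the 256-value range of one byte
        // still fits in a single char (one UTF-16 code unit).
        String yen = "円"; // U+5186
        System.out.println(yen.length()); // 1

        // Code points above U+FFFF take a surrogate pair: two chars, one code point.
        String emoji = "\uD83D\uDE00"; // U+1F600
        System.out.println(emoji.length());                          // 2
        System.out.println(emoji.codePointCount(0, emoji.length())); // 1

        // Encoded size depends on the charset, not just on char:
        System.out.println(yen.getBytes(StandardCharsets.UTF_8).length);    // 3 bytes
        System.out.println(yen.getBytes(StandardCharsets.UTF_16BE).length); // 2 bytes
    }
}
```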