Timeline for "How could I optimize an AJAX-based site by avoiding unnecessary/duplicate file-reads for each AJAX call?"
Current License: CC BY-SA 4.0
7 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Apr 13, 2020 at 2:37 | vote: accept | mmseng | | |
| Apr 9, 2020 at 21:18 | answer added | GHP | | timeline score: 1 |
| Apr 9, 2020 at 21:05 | review: Close votes | | | completed Apr 24, 2020 at 3:02 |
| Apr 9, 2020 at 20:50 | comment added | mmseng | | @GregBurghardt I forgot to mention, the data in question WAS formerly stored in associative arrays, but I intentionally moved it out to CSV files so that it could be automatically updated on a schedule and/or manually updated by other staff who will most likely not have sufficient coding knowledge. Even if they did, diving into 7000+ lines of unfamiliar PHP and JS, just to add some data about a new printer model is what I was trying to get away from. Performance has not been an issue, so admittedly, this is somewhat of an academic question. |
| Apr 9, 2020 at 20:45 | comment added | Greg Burghardt | | A) Have you measured a performance problem; B) Why not hard code the info as PHP associative arrays? |
| Apr 9, 2020 at 20:45 | review: First posts | | | completed Apr 19, 2020 at 8:33 |
| Apr 9, 2020 at 20:40 | history: asked | mmseng | CC BY-SA 4.0 | |
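
The comments above weigh keeping the data in CSV files (easy for non-coders and scheduled jobs to update) against hard-coding it as PHP associative arrays (no per-request parsing). A middle ground, sketched below, is to keep the CSV files but cache the parsed result in shared memory so repeated AJAX calls don't re-read and re-parse them. This is only an illustrative sketch under stated assumptions, not the approach from the question or its answer: it assumes the APCu extension is available, and `load_printer_data()` / `printers.csv` are hypothetical names.

```php
<?php
// Illustrative sketch (assumes the APCu extension is installed and enabled).
// Caches the parsed CSV in shared memory so repeated AJAX requests don't
// re-read and re-parse the file. The function and file names are hypothetical.

function load_printer_data(string $csvPath): array
{
    // Including filemtime() in the key invalidates the cache automatically
    // whenever staff (or a scheduled job) update the CSV file.
    $cacheKey = 'csv-cache:' . $csvPath . ':' . filemtime($csvPath);

    $cached = apcu_fetch($cacheKey, $hit);
    if ($hit) {
        return $cached; // served from memory, no file read
    }

    // Cache miss: parse the CSV once.
    $rows = [];
    if (($handle = fopen($csvPath, 'r')) !== false) {
        $header = fgetcsv($handle);
        while (($fields = fgetcsv($handle)) !== false) {
            if ($header !== false && count($fields) === count($header)) {
                $rows[] = array_combine($header, $fields);
            }
        }
        fclose($handle);
    }

    apcu_store($cacheKey, $rows, 3600); // keep for up to an hour
    return $rows;
}

// Example AJAX endpoint usage:
header('Content-Type: application/json');
echo json_encode(load_printer_data(__DIR__ . '/printers.csv'));
```

Whether caching is worth the extra moving parts depends on measurement, as Greg Burghardt's comment suggests; for small CSV files the operating system's file cache already makes repeated reads fairly cheap.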