Timeline for wget and curl saving web page as gibberish (encrypted?)
Current License: CC BY-SA 3.0
10 events
| when | what | by | license | comment |
|---|---|---|---|---|
| Jun 9, 2017 at 10:51 | comment added | Ned64 | | For me, wget saves plain HTML for the given URI. It is decompressed after transport. Do you possibly have some extra wget option set that causes it to save raw data? |
| S Jun 9, 2017 at 10:47 | history suggested | JB0x2D1 | CC BY-SA 3.0 | fixed broken link |
| Jun 9, 2017 at 10:36 | review: suggested edits | | | completed S Jun 9, 2017 at 10:47 |
| Jun 8, 2017 at 14:16 | comment added | Yaron | | @JB0x2D1 - updated my answer |
| Jun 8, 2017 at 14:15 | history edited | Yaron | CC BY-SA 3.0 | added 387 characters in body |
| Jun 8, 2017 at 14:03 | comment added | JB0x2D1 | | why would this page save compressed? |
| Jun 8, 2017 at 13:47 | history edited | Yaron | CC BY-SA 3.0 | added 3 characters in body |
| Jun 8, 2017 at 13:36 | vote accept | JB0x2D1 | | |
| Jun 8, 2017 at 13:25 | history answered | Yaron | CC BY-SA 3.0 | |
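The comments above discuss a page that was saved still gzip-compressed, so the file looks like gibberish. As a minimal sketch (the filename `page.html` and sample content are placeholders, not from the original question), you can check a saved file for the gzip magic bytes `1f 8b` and decompress it:

```shell
# Simulate a page saved raw while still gzip-compressed
# (the "gibberish" case discussed in the comments above).
printf '<html><body>It works</body></html>' | gzip -c > page.html

# gzip data starts with the magic bytes 1f 8b; test for them.
magic=$(head -c 2 page.html | od -An -tx1 | tr -d ' \n')
if [ "$magic" = "1f8b" ]; then
  mv page.html page.html.gz
  gunzip page.html.gz          # recovers a readable page.html
fi

cat page.html                  # now plain HTML
```

To avoid the problem in the first place, `curl --compressed` asks curl to decompress the response body itself, and newer wget releases (1.19.2 and later) accept `--compression=auto` for the same purpose.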