Looks pretty good!
headings
I am sad the first line of output isn't
name,height,mass,hair_color,skin_color
That would make the resulting .csv file far more
valuable to programs reading it,
and to maintenance engineers who stumble upon
such data files a few months or years from now.
It's easy to make the data self-describing,
so you might as well do that.
On the plus side, "people.csv" is a nicely
descriptive filename.
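For example, a minimal sketch using the csv standard
library (COLUMNS is a name I'm introducing here; adapt it
to however your script tracks the fields):

    require "csv"

    COLUMNS = %w[name height mass hair_color skin_color]

    CSV.open("people.csv", "w") do |csv|
      csv << COLUMNS    # header row makes the file self-describing
      # ... data rows follow ...
    end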
looping over columns
To write out that header line you would need the
corresponding list of column names, and then you could
iterate over it each time you write a new row.
Notice that some APIs may omit e.g. the C-3PO hair_color
attribute rather than filling it in with n/a.
The nested loops I propose, over rows and columns,
would give you just a single place where you dereference
result attributes, a single place to insert a "paranoid"
check in case an attribute is missing.
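A sketch of that shape, reusing the COLUMNS list above
(results stands in for the parsed "results" array of one
page, and the n/a fallback is my guess at how you would
want to fill gaps):

    results.each do |person|
      # single dereference point, with a defensive default
      csv << COLUMNS.map { |col| person.fetch(col, "n/a") }
    end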
pagination
The count attribute explains there are 82 people to
query, so I would expect (with header) an 83-line result file.
You passed up the opportunity to query the next "?page=2"
URL to learn Palpatine's details.
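Pulling the earlier pieces together, a sketch that walks
every page by following the next link the API hands back
(the starting URL is my assumption about the endpoint you
are querying):

    require "csv"
    require "json"
    require "net/http"

    COLUMNS = %w[name height mass hair_color skin_color]

    CSV.open("people.csv", "w") do |csv|
      csv << COLUMNS
      url = "https://swapi.dev/api/people/"
      while url
        page = JSON.parse(Net::HTTP.get(URI(url)))
        page["results"].each do |person|
          csv << COLUMNS.map { |col| person.fetch(col, "n/a") }
        end
        url = page["next"]    # nil after the last page ends the loop
      end
    end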
end of loop
Notice that on "?page=9" the next attribute comes
back as null, signalling that e.g. a page 10 query would 404.
That is, a page 10 request would come back as
Net::HTTPNotFound, a kind of Net::HTTPClientError response.
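If you did overshoot, get_response makes that visible,
whereas Net::HTTP.get would just hand back the 404 body
as a string; the URL here is again my assumption:

    require "net/http"

    res = Net::HTTP.get_response(URI("https://swapi.dev/api/people/?page=10"))
    res.is_a?(Net::HTTPNotFound)    # => true, a Net::HTTPClientError
    res.value                       # raises Net::HTTPClientException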
error handling
On which topic, this looks fine:
    response = JSON.parse(Net::HTTP.get(uri))
If the .get() throws, we'll see an informative diagnostic,
and similarly if the .parse() doesn't work out.
You might consider breaking that nice concise one-liner
into a pair of lines using a temp var,
just so a potential stack trace will be more diagnostic.
That is, if only one thing happens on the offending line,
a maintenance engineer will immediately know whether it
was a network issue or badly formed JSON that came back.
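Something like this sketch:

    body = Net::HTTP.get(uri)      # network trouble surfaces here
    response = JSON.parse(body)    # malformed JSON surfaces here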
We somewhat cavalierly dereference response["results"] and I
think that is Just Fine, as it will trigger a similarly
diagnostic error if it turns out the server's API changed
how it formats things.
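If you ever wanted that failure one line earlier, Hash#fetch
names the missing key at the point of lookup:

    results = response.fetch("results")    # KeyError if the key is absent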
What you wrote is fine as-is, but
if you were in a paranoid frame of mind, you might
wish to verify that a
content-type: application/json
header came back before attempting the JSON parse.
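A sketch of that paranoid version; note it needs
get_response, since Net::HTTP.get returns only the body:

    res = Net::HTTP.get_response(uri)
    ctype = res["Content-Type"].to_s
    raise "unexpected content-type: #{ctype}" unless ctype.start_with?("application/json")
    response = JSON.parse(res.body)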