
I am using Python pandas to run a query against MySQL. On the UI side I use the Flot API to chart the MySQL data. Below is the existing implementation:

query2 = f"""select time, value from NWSTAT 
             where time > \'{from_date}\' 
             and time < \'{to_date}\'"""
result2 = pd.read_sql(query2, engine)
return result2.to_json(orient='records')

I get the result in the format below:

[{"time": 1581931200000, "value": 0.0}, {"time": 1581931200000, "value": 0.0}, 
 {"time": 1581931200000, "value": 0.0}, {"time": 1581931200000, "value": 0.0}]

From this response I build the structure below for the Flot API on the UI JavaScript side:

[[1581931200000,0],[1581931200000,0],[1581931200000,0],[1581931200000,0]]

Is there any way to do this on the Python side itself, without any iteration, directly from the query result?

Using a Flask server. UI side: jQuery, Handlebars.js.

EDIT: The second approach in the accepted answer takes less time. Below is the time taken by each approach for 240k records:

 First one: --- 1.6689300537109375e-06 seconds ---
 Second one: --- 0.5330650806427002 seconds ---
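A minimal sketch of a timing harness that produces output in the "--- N seconds ---" style above (the setup is assumed; the original post does not show how the measurements were taken, and the DataFrame here is a hypothetical stand-in for the real query result):

import json
import time

import pandas as pd

# Hypothetical stand-in for the real query result (240k rows)
result2 = pd.DataFrame({'time': [1581931200000] * 240000,
                        'value': [0.0] * 240000})

start_time = time.time()
payload = json.dumps([list(x.values()) for x in result2.to_dict(orient='records')])
print("--- %s seconds ---" % (time.time() - start_time))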

1 Answer

The problem is that if you convert both columns to a NumPy array, the integers are changed to floats:

print(json.dumps(result2.to_numpy().tolist()))
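For illustration, a minimal sketch with a toy two-row DataFrame (hypothetical data standing in for result2) shows the coercion: mixing an integer column with a float column forces NumPy to promote everything to float.

import json

import pandas as pd

df = pd.DataFrame({'time': [1581931200000, 1581931200000],
                   'value': [0.0, 0.0]})

print(df.dtypes)
# time       int64
# value    float64

print(json.dumps(df.to_numpy().tolist()))
# [[1581931200000.0, 0.0], [1581931200000.0, 0.0]]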

The first idea is to build lists from the .values() of each record dictionary and convert the result to JSON:

import json

import pandas as pd

query2 = f"""select time, value from NWSTAT
             where time > '{from_date}'
             and time < '{to_date}'"""
result2 = pd.read_sql(query2, engine)
# to_dict(orient='records') extracts values column by column,
# so the 'time' integers are preserved
return json.dumps([list(x.values()) for x in result2.to_dict(orient='records')])
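On the same toy DataFrame (hypothetical data, for illustration only), this approach keeps the integer timestamps intact, because the values never pass through a single mixed-dtype NumPy array:

import json

import pandas as pd

df = pd.DataFrame({'time': [1581931200000, 1581931200000],
                   'value': [0.0, 0.0]})

print(json.dumps([list(x.values()) for x in df.to_dict(orient='records')]))
# [[1581931200000, 0.0], [1581931200000, 0.0]]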

Or change the format with DataFrame.to_dict using orient='list' (older pandas also accepted the short alias 'l', which newer pandas no longer supports), then pair the lists row-wise with zip and map, and finally convert to JSON:

import json

import pandas as pd

query2 = f"""select time, value from NWSTAT
             where time > '{from_date}'
             and time < '{to_date}'"""
result2 = pd.read_sql(query2, engine)
# to_dict(orient='list') yields {'time': [...], 'value': [...]};
# zip(*...) transposes the column lists into [time, value] rows
return json.dumps(list(map(list, zip(*result2.to_dict(orient='list').values()))))
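A step-by-step view on the same toy DataFrame (hypothetical data) makes the zip trick clearer: to_dict(orient='list') produces one Python list per column, and zip(*...) transposes those columns into rows.

import json

import pandas as pd

df = pd.DataFrame({'time': [1581931200000, 1581931200000],
                   'value': [0.0, 0.0]})

d = df.to_dict(orient='list')
# {'time': [1581931200000, 1581931200000], 'value': [0.0, 0.0]}

print(json.dumps(list(map(list, zip(*d.values())))))
# [[1581931200000, 0.0], [1581931200000, 0.0]]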

4 Comments

Thanks for the updated solution; the second one gives the same result in less time. Is there any option to handle NaN values?
@saravanakumar - What should happen with NaNs? Should those rows be removed?
I handled it on the MySQL side: select time, COALESCE(value,0) from NWSTAT .... This solves my problem. Thank you so much for your timely help!
Check the response time for 240K records: first method --- 1.6689300537109375e-06 seconds ---; second method --- 0.5330650806427002 seconds ---
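As noted in the comments, the NaN issue was solved with COALESCE(value,0) in SQL. A pandas-side sketch that is equivalent in spirit (hypothetical data; fillna is the standard pandas way to substitute a default for NaN) would be:

import json

import numpy as np
import pandas as pd

df = pd.DataFrame({'time': [1581931200000, 1581931200000],
                   'value': [0.5, np.nan]})

# Replace NaN with 0 before serialising, mirroring COALESCE(value, 0)
df['value'] = df['value'].fillna(0)

print(json.dumps(list(map(list, zip(*df.to_dict(orient='list').values())))))
# [[1581931200000, 0.5], [1581931200000, 0.0]]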
