If you are making many requests, it is worth having a look at `requests.Session`. It re-uses the connection to the server across requests (and also persists headers and cookies).

I would also start separating your different responsibilities: one part for the global constants (like your headers and cookies), one function that actually does a single submission, one function that creates the session and runs each submission, and finally some code under an `if __name__ == '__main__':` guard that executes it all.
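As a minimal sketch of what a shared session buys you (the header value, cookie, and URLs below are made up purely for illustration):

```python
import requests

# A shared Session re-sends the same headers and cookies with every
# request made through it, and keeps the TCP connection alive.
session = requests.Session()
session.headers.update({'User-Agent': 'my-script/1.0'})  # hypothetical header
session.cookies.update({'sessionid': 'abc123'})          # hypothetical cookie

# Both requests below would reuse the same connection and carry the
# header and cookie set above (placeholder URLs, so commented out):
# session.get('https://example.com/api/a')
# session.get('https://example.com/api/b')
print(session.headers['User-Agent'])
```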
import requests
import json

COOKIES = {<redacted>}
HEADERS = {<redacted>}


def post(session, url, data):
    """Use `session` to post `data` to `url`.

    Returns the JSON object with the execute result."""
    response = session.post(url, data=data)
    json_file = response.content.split('<ExecuteResult>')[1]
    json_file = json_file.split('</ExecuteResult>')[0].replace("\\", "")
    return json.loads(json_file)
def post_submissions(url, data, n):
    """Create a session and post `n` submissions to `url` using `data`."""
    session = requests.Session()
    session.headers.update(HEADERS)
    session.cookies.update(COOKIES)
    for _ in xrange(n):
        json_response = post(session, url, data)
        print "Created Id:", json_response['_entity']['Id']


if __name__ == "__main__":
    url = "<redacted>"
    data = "<redacted>"
    post_submissions(url, data, input("How many submissions to create: "))
I also added some rudimentary docstrings.
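For reference, the string-based extraction in `post` can be exercised on a made-up server reply (the payload below is invented for illustration; your real responses will differ):

```python
import json

# Hypothetical server response wrapping JSON in an <ExecuteResult> element.
sample = ('<Response><ExecuteResult>'
          '{"_entity": {"Id": 42}}'
          '</ExecuteResult></Response>')

# Same steps as in post(): cut out the element body, strip backslashes,
# then parse the remaining JSON.
payload = sample.split('<ExecuteResult>')[1]
payload = payload.split('</ExecuteResult>')[0].replace("\\", "")
result = json.loads(payload)
print(result['_entity']['Id'])  # → 42
```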
You could also change your parsing by using BeautifulSoup:
from bs4 import BeautifulSoup


def post(session, url, data):
    """Use `session` to post `data` to `url`.

    Returns the JSON object with the execute result."""
    response = session.post(url, data=data)
    soup = BeautifulSoup(response.content, 'lxml')
    result = soup.find('executeresult').text.replace("\\", "")
    return json.loads(result)

Note that `find` returns a tag object, so you need `.text` to get the string content before parsing it, and that the `lxml` parser lowercases tag names, which is why the search is for `'executeresult'`.
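Run against the same kind of invented reply as above (this assumes `bs4` and `lxml` are installed; the payload is made up for illustration):

```python
import json
from bs4 import BeautifulSoup

# Hypothetical server response wrapping JSON in an <ExecuteResult> element.
sample = ('<Response><ExecuteResult>'
          '{"_entity": {"Id": 42}}'
          '</ExecuteResult></Response>')

soup = BeautifulSoup(sample, 'lxml')
# The lxml parser lowercases tag names, so search for 'executeresult'.
text = soup.find('executeresult').text.replace("\\", "")
print(json.loads(text)['_entity']['Id'])  # → 42
```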