In our company we combine every JavaScript file into one big (around 700 kB and growing) minified and gzipped JavaScript file. I am trying to assess the performance difference between serving that one big file on every page (minified and gzipped) and serving several smaller JavaScript files, one for each page.
An obvious difference is that the big JavaScript file can be cached by the browser after the first page request and adds little overhead thereafter, whereas with several per-page files there will be at least one uncached GET request on each different page. So I would be trading a slower first page load for faster successive page loads.
In order to find out when the slow initial page load (with one big JavaScript file) becomes problematic enough to justify the work of breaking the combined file up into smaller files and changing our build process, I would like to know how long it takes for the code to be parsed, so that I can estimate the total loading and parsing time.
So far my approach has been to add a script tag to a test page that records the current time, then append a biggish script via JavaScript and measure the time again at the end of that script, like so:
var head = document.getElementsByTagName('head')[0];
var script = document.createElement('script');
script.setAttribute('type', 'text/javascript');
script.src = 'path/700kbCombineFile.js';

// Record the start time right before the script element is inserted.
// start_time stays in global scope so the marker at the end of the combined file can read it.
var start_time = new Date().getTime();
head.appendChild(script);
At the end of 700kbCombineFile.js, I appended:
console.log(new Date().getTime() - start_time);
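For comparison, I assume roughly the same total could also be taken from the script element's load event together with performance.now() where the browser supports it, since onload should only fire once the file has been downloaded, parsed and executed. This is just a sketch of that idea, not something I have measured yet:

var head = document.getElementsByTagName('head')[0];
var script = document.createElement('script');
script.type = 'text/javascript';
script.src = 'path/700kbCombineFile.js';   // same combined file as above

var t0 = performance.now();

// onload fires only after the external file has been fetched, parsed and executed.
script.onload = function () {
    console.log('download + parse + execute: ' + (performance.now() - t0) + ' ms');
};

head.appendChild(script);

If both numbers roughly agree, the end-of-file marker is probably measuring what I think it is.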
From the number logged by that end-of-file marker I then subtract the network transfer time reported by Firebug, which leaves approximately 700 ms for a 700 kB file and about 300 ms for a 300 kB file.
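Presumably the transfer time could also be read programmatically through the Resource Timing API instead of from Firebug, which would make the subtraction less manual. A minimal sketch, assuming performance.getEntriesByName is available and that the placeholder URL below is replaced with the real location of the combined file:

// Query the Resource Timing entry for the combined file after it has loaded.
var entries = performance.getEntriesByName(
    'http://example.com/path/700kbCombineFile.js');  // placeholder: full URL of the script

if (entries.length > 0) {
    var e = entries[0];
    // Time spent on the network: from the start of the fetch until the last byte arrived.
    var transfer = e.responseEnd - e.fetchStart;
    // Subtracting this from the total measured above should leave parse + execute time.
    console.log('network transfer: ' + transfer + ' ms');
}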
Does this approach make sense? If not, why not? Is there a better way, or are there any tools, for measuring this?