When your web server has to serve many scripts before a page can run, it can affect your website's performance, causing it to drag or delay loading key elements. When crawling a JS-heavy page, Googlebot first processes the static parts, the HTML and CSS. The page is then put into a queue, and Googlebot renders and indexes the JavaScript-generated content later, when more resources are available.
Relying more on HTML and CSS can help
When it comes to crawling, indexing, and overall user experience, it's best to rely primarily on HTML and CSS. Because they degrade more gracefully than JavaScript, they're less likely to fail or ding your site when it's crawled.
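As a rough sketch of what this looks like in practice (the markup below is a made-up example, not from any particular site): put the content a crawler needs directly in the HTML, and let JavaScript enhance the page rather than build it.

```html
<!-- Content is present in the initial HTML, so Googlebot can
     index it on the first pass, before any JS rendering. -->
<article>
  <h1>Product name</h1>
  <p>Key product description lives here, in plain HTML.</p>
</article>

<!-- JS only enhances what's already there; if it fails or is
     deferred, the core content is still crawlable. -->
<script defer src="/enhancements.js"></script>
```

The opposite pattern, an empty `<div>` filled in by a script, leaves nothing for the first crawl pass to index.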
For more info on the subject, check out this video: