
#914 ✓invalid

JSON decoding can be quicker

Reported by Barryvan | May 28th, 2010 @ 05:27 AM

Currently, when decoding a JSON string, simple string concatenation is used: eval('(' + string + ')'). This is fine, but for very large JSON strings, using array joining, eval(['(', string, ')'].join('')), can be orders of magnitude faster. In some of the tests I ran, array joining was around fifty times faster!

I've attached a patch for this behaviour to this ticket; it's really quite a trivial modification.
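For reference, the two decode forms under discussion differ only in how the parenthesis-wrapped string handed to eval() is built. A minimal sketch (the function names are illustrative, not MooTools' actual API):

```javascript
// Minimal sketch of the two variants discussed in this ticket.
// The function names are illustrative, not MooTools' actual API.

// Current approach: plain string concatenation.
function decodeConcat(string) {
  return eval('(' + string + ')');
}

// Proposed approach: build the wrapped string via Array.join().
function decodeJoin(string) {
  return eval(['(', string, ')'].join(''));
}

// Both produce identical results; any speed difference comes purely
// from how the engine assembles the intermediate string.
var a = decodeConcat('{"value": 42}');
var b = decodeJoin('{"value": 42}');
```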

Comments and changes to this ticket

  • Arian

    Arian May 28th, 2010 @ 02:35 PM

    I've made a jsfiddle and cannot confirm the speed improvement. (It might take some time to run: 10,000 iterations of each variant take about 8 seconds apiece in Chrome.)

    Can you give a link showing that it's indeed faster?
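A timing harness along these lines (illustrative only; this is not Arian's actual jsfiddle) makes the comparison easy to reproduce:

```javascript
// Illustrative timing harness, not Arian's actual jsfiddle.
// Builds a largish JSON payload and times each decode variant.
var big = [];
for (var i = 0; i < 1000; i++) big.push(i);
var json = JSON.stringify(big);

// Returns the elapsed time in milliseconds for `iterations` calls of fn.
function time(fn, iterations) {
  var start = Date.now();
  for (var j = 0; j < iterations; j++) fn();
  return Date.now() - start;
}

var concatMs = time(function () { eval('(' + json + ')'); }, 1000);
var joinMs = time(function () { eval(['(', json, ')'].join('')); }, 1000);

console.log('concat: ' + concatMs + 'ms, join: ' + joinMs + 'ms');
```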

  • Sebastian Markbåge

    Sebastian Markbåge May 28th, 2010 @ 05:00 PM

    I think the point is to handle very large string concatenation, so the test would have to decode large strings.

    But I'm not getting any significant performance improvements (relative to the eval overhead).

  • Barryvan

    Barryvan May 30th, 2010 @ 01:33 AM

    It looks like this might just have been a glitch during my testing. I can only get consistently better times with join() in Chrome; Fx3.7a4, Safari 4.0.5, Opera 10.5, and IE8 all post roughly the same times for both operations. (The IE9 platform preview failed to render the jsFiddle shell, then crashed. :/ )

    I'll do another round of testing when I'm back at work tomorrow; when I first noticed these results, I was using Fx3.6 stable in a VM, so it may just have been noise. Sorry!

  • Christoph Pojer

    Christoph Pojer May 30th, 2010 @ 02:02 PM

    • State changed from “new” to “invalid”
