“A spork of Node.js with an open governance model”.
All that sounds very promising!
Things were moving really slowly lately, but now I'm sure that development will speed up.
Among the team members there are some of the finest Node.js superstars, including Ben Noordhuis (also famous for libuv), Isaac Schlueter (also famous for npm) and Fedor Indutny.
Some people are concerned that it will cause fragmentation in the community. IMHO, they are wrong.
Firstly, io.js and Node.js will continue to share the same package registry, so fragmentation will only affect core development, not the modules you install with npm.
Secondly, most of the core developers have moved to io.js. That's quite a sign: something is wrong with Joyent's governance. Without the drag of that corporation, devs gonna dev, and that's good news.
We don't want slow and ailing governance like PHP's; those issues should be settled as soon as possible.
So, I see only two options:
the happy ending: reconciliation between Node.js & io.js will happen soon, the two projects will merge, and the dissidents will gain the power to put the governance model on the right track
the putsch: let's face it, io.js is better than Node.js in every aspect and its governance is open... so it will simply overtake Node.js, and everyone will make the switch very quickly (just like the switch from MySQL to MariaDB)
In any case, no fragmentation will occur... and server-side Javascript will enjoy a good nitro-boost.
Harmony
Harmony is the upgrade Javascript needed. It implements ECMAScript 6 and really moves the language forward.
Keyed collections (Map, Set, WeakMap, WeakSet) are very interesting.
In my previous post, detecting an acyclic/cyclic graph in an object data structure was done using an array of objects, searching for multiple references to the same object with .indexOf(). No doubt about it: on a big graph, this can easily become the bottleneck. Using a WeakMap could be much faster: the object to check is the key.
Speaking of algorithmic complexity, searching for a value in an array is O(n). Searching for a key should be O(1) as far as I can tell (if it is a hashmap), or at least O(log n) if WeakMap uses some kind of binary tree. Way faster...
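To make that concrete, here is a minimal sketch (not the code from my previous post) of cycle detection using a WeakSet as the lookup structure, so each membership test is a constant-time operation instead of an array scan:

```javascript
// Detect whether an object graph contains a cycle.
// A WeakSet tracks the objects on the *current* path only,
// so diamond-shaped (acyclic) sharing is not a false positive.
function isCyclic( root ) {
    var inPath = new WeakSet() ;
    
    function visit( node ) {
        // the node is one of its own ancestors: that's a cycle
        if ( inPath.has( node ) ) { return true ; }
        inPath.add( node ) ;
        
        for ( var key in node ) {
            if ( node[ key ] && typeof node[ key ] === 'object' ) {
                if ( visit( node[ key ] ) ) { return true ; }
            }
        }
        
        // leaving this branch: remove the node from the current path
        inPath.delete( node ) ;
        return false ;
    }
    
    return visit( root ) ;
}
```

Note that the WeakSet here holds only objects currently being visited, so it distinguishes a true cycle from two properties merely sharing the same sub-object.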
Some cool new features in io.js:
default parameters
Promise
let, const
generators
template strings
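Here is a small sketch combining several of those features; keep in mind that on the io.js of that era some of them needed 'use strict' or a --harmony flag, while any modern runtime accepts them as-is:

```javascript
'use strict' ;

// default parameter + template string
function greet( name = 'world' ) {
    return `Hello ${name}!` ;
}

// const for a binding that never changes, let for block-scoped ones
const limit = 3 ;

// a generator yielding numbers up to the limit
function* counter() {
    for ( let i = 1 ; i <= limit ; i ++ ) { yield i ; }
}

let sum = 0 ;
for ( let n of counter() ) { sum += n ; }
// sum is now 1 + 2 + 3 = 6
```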
Cool Harmony features that are not in io.js at the moment:
Proxy
Spread operator
Performance
No benchmarks at the moment, but I ran the spaceship demo of the terminal-kit lib on my laptop... The script consumes 12-14% of CPU with Node.js, and only 5-7% of CPU with io.js.
A shallow copy will clone the top-level object, but nested objects are shared between the original and the clone. That is: an object reference is copied, not the object itself. So if the original object contains nested objects, the clone will not be a fully distinct entity.
A deep copy will recursively clone every object it encounters. The clone and the original will not share anything, so the clone is a fully distinct entity.
Shallow copies are faster than deep copies.
When it is OK to share some data, you may use a shallow copy. There are even use cases where it is the best tool for the job. But whenever you need to clone a deep and complex data structure, a tree, you will have to perform a deep copy. Keep in mind that on a really big tree, it can be an expensive operation.
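The difference is easy to see with a tiny hand-rolled example (the property names here are just for illustration):

```javascript
var original = { name: 'original' , nested: { count: 1 } } ;

// shallow copy: copy each top-level property by hand;
// 'nested' is copied as a reference, so it stays shared
var shallow = { name: original.name , nested: original.nested } ;

// deep copy: also clone the nested object itself
var deep = { name: original.name , nested: { count: original.nested.count } } ;

original.nested.count = 42 ;
// shallow.nested.count is now 42 too (shared object),
// while deep.nested.count is still 1 (distinct object)
```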
How to perform a deep copy of an object in Javascript
We need to detect properties containing objects, and recursively call the deepCopy() function again.
Here is the result:
function naiveDeepCopy( original ) {
    // First create an empty object with
    // the same prototype as our original source
    var clone = Object.create( Object.getPrototypeOf( original ) ) ;
    var i , descriptor , keys = Object.getOwnPropertyNames( original ) ;
    
    for ( i = 0 ; i < keys.length ; i ++ ) {
        // Save the source's descriptor
        descriptor = Object.getOwnPropertyDescriptor( original , keys[ i ] ) ;
        
        if ( descriptor.value && typeof descriptor.value === 'object' ) {
            // If the value is an object, recursively deepCopy() it
            descriptor.value = naiveDeepCopy( descriptor.value ) ;
        }
        
        Object.defineProperty( clone , keys[ i ] , descriptor ) ;
    }
    
    return clone ;
}
By the way, if the property is a getter/setter, then descriptor.value will be undefined, so we won't recurse into it, and that's what we want: we don't care whether the getter returns an object or not.
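You can verify this behavior yourself: for an accessor property, the descriptor exposes .get/.set and no .value, even when the getter returns an object.

```javascript
var obj = {
    // an accessor property whose getter returns an object
    get answer() { return { value: 42 } ; }
} ;

var descriptor = Object.getOwnPropertyDescriptor( obj , 'answer' ) ;
// descriptor.value is undefined, descriptor.get is a function:
// the recursion test on descriptor.value skips accessors entirely
```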
There are still unsolved issues:
Circular references will produce a stack overflow
Some native objects, like Date or Array, are not cloned properly
Design pattern emulating private members using a closure's scope cannot be truly cloned (e.g. the revealing pattern)
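The Date issue is easy to demonstrate (naiveDeepCopy() is re-declared here so the snippet is self-contained): a Date stores its timestamp in an internal slot, not in an own property, so the clone gets Date.prototype but no timestamp, and its methods throw.

```javascript
function naiveDeepCopy( original ) {
    var clone = Object.create( Object.getPrototypeOf( original ) ) ;
    var i , descriptor , keys = Object.getOwnPropertyNames( original ) ;
    for ( i = 0 ; i < keys.length ; i ++ ) {
        descriptor = Object.getOwnPropertyDescriptor( original , keys[ i ] ) ;
        if ( descriptor.value && typeof descriptor.value === 'object' ) {
            descriptor.value = naiveDeepCopy( descriptor.value ) ;
        }
        Object.defineProperty( clone , keys[ i ] , descriptor ) ;
    }
    return clone ;
}

var date = new Date( 0 ) ;
var brokenClone = naiveDeepCopy( date ) ;

var failed = false ;
try {
    // the clone has Date.prototype but no internal timestamp:
    // calling a Date method on it throws a TypeError
    brokenClone.getTime() ;
}
catch ( error ) {
    failed = true ;
}
```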
What is this circular reference thing?
Let's look at that object:
var o = {
    a: 'a' ,
    sub: {
        b: 'b'
    } ,
    sub2: {
        c: 'c'
    }
} ;
o.loop = o ;
o.sub.loop = o ;
o.subcopy = o.sub ;
o.sub.link = o.sub2 ;
o.sub2.link = o.sub ;
This object references itself.
That means that o.loop.a = 'Ha!' implies that console.log( o.a ) outputs "Ha!" rather than "a". Remember how object assignment works? o and o.loop simply point to the same object.
However, the naiveDeepCopy() method above does not check that fact and therefore is doomed, iterating o.loop.loop.loop.loop.loop... forever.
That is what is called a circular reference.
Even without the loop properties, the original object wants its subcopy and sub properties to point to the same object. Here again, the naiveDeepCopy() method would produce two different and independent clones.
A good clone method should be able to overcome that.
The closure's scope hell
Okay, let's examine this code:
function myConstructor() {
    var myPrivateVar = 'secret' ;
    
    return {
        myPublicVar: 'public!' ,
        getMyPrivateVar: function() {
            return myPrivateVar ;
        } ,
        setMyPrivateVar: function( value ) {
            myPrivateVar = value.toString() ;
        }
    } ;
}

var o = myConstructor() ;
So... o is an object containing three properties, the first is a string, the two others are methods.
The methods use a variable of the parent scope, the scope of myConstructor(). That variable (named myPrivateVar) is created when the constructor is called; while it is not part of the constructed object in any way, it remains in use by those methods.
Therefore, if we try to clone the object, methods of both the original and the clone will still refer to the same parent's scope variable.
It would not be a problem, except that this is a common Javascript pattern for simulating private members...
As far as I know, there is no way to alter the scope of a closure, so this is a dead end: patterns using the parent scope cannot be cloned correctly.
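A quick demonstration of the dead end, using the myConstructor() from above: even a property-by-property copy ends up with methods that close over the very same variable.

```javascript
function myConstructor() {
    var myPrivateVar = 'secret' ;
    return {
        myPublicVar: 'public!' ,
        getMyPrivateVar: function() { return myPrivateVar ; } ,
        setMyPrivateVar: function( value ) { myPrivateVar = value.toString() ; }
    } ;
}

var original = myConstructor() ;

// a simple property-by-property copy is enough to show the issue
var clone = {} ;
for ( var key in original ) { clone[ key ] = original[ key ] ; }

clone.setMyPrivateVar( 'changed by the clone' ) ;
// the change leaks into the original: both objects' methods
// close over the *same* myPrivateVar variable
```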
Next step: using a library
Okay, so far we have done a good job hacking Javascript, and it was fun.
Now, how about using a ready-to-use library?
The tree-kit library has a great clone() method that works in most use cases.
It happens that I'm actually the author of this lib, probably some kind of coincidence! ;)
clone( original , [circular] )
original Object the source object to clone
circular boolean (defaults to false): if true, circular references are checked and identical objects are reconnected (referenced); if false, nested objects are blindly cloned
It returns a clone of the original object.
How to use it? That's pretty straightforward:
first run the command npm install tree-kit --save into your project directory
then use it like this:
var tree = require( 'tree-kit' ) ;
var myClone = tree.clone( myOriginal ) ;
... where myOriginal is the object you want to clone.
Some optimization work has been done, so tree.clone() should be able to clone large structures efficiently.
One big step in optimization: removing recursion from the algorithm, so everything takes place in a loop. This avoids stack overflows and function-call overhead. As a side effect, the depth-first search has been replaced by a breadth-first search.
Great news: this method is able to detect circular references and reconnect them if the circular option is set to true! Oooh yeah!
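The general technique behind circular-aware cloning looks like this (a sketch of the idea, not tree-kit's actual implementation): keep a Map from each source object to its clone, and when a source object is seen a second time, reconnect to the existing clone instead of recursing forever.

```javascript
function cloneWithCircularRefs( original ) {
    // maps each source object to its already-created clone
    var map = new Map() ;
    
    function copy( source ) {
        // seen before: reconnect to the existing clone
        if ( map.has( source ) ) { return map.get( source ) ; }
        
        var clone = Object.create( Object.getPrototypeOf( source ) ) ;
        // register *before* recursing, so cycles terminate
        map.set( source , clone ) ;
        
        var keys = Object.getOwnPropertyNames( source ) ;
        for ( var i = 0 ; i < keys.length ; i ++ ) {
            var descriptor = Object.getOwnPropertyDescriptor( source , keys[ i ] ) ;
            if ( descriptor.value && typeof descriptor.value === 'object' ) {
                descriptor.value = copy( descriptor.value ) ;
            }
            Object.defineProperty( clone , keys[ i ] , descriptor ) ;
        }
        
        return clone ;
    }
    
    return copy( original ) ;
}

var o = { a: 'a' } ;
o.loop = o ;
var c = cloneWithCircularRefs( o ) ;
// c.loop === c: the cycle is reproduced inside the clone,
// and c is fully distinct from o
```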
After an assignment like copy = object, the two variables object & copy reference the same object, so whichever variable you use to modify it, you will get the same result.
If you come from a C/C++ background, you should understand that object.a in Javascript should be translated into object->a in C/C++, it will help understand how copy = object works.
When it comes to object, a Javascript variable behaves more like a kind of automatic pointer.
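A few lines show both the pointer-like behavior and its limit, the re-assignment:

```javascript
var object = { a: 1 } ;
var copy = object ;   // copies the *reference*, not the object

copy.a = 7 ;
// object.a is now 7 as well: both variables point to the same object

copy = { a: 100 } ;   // re-assignment: copy now points elsewhere
// object.a is still 7, the original object is untouched
```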
Also, there is a misleading saying commonly heard in Javascript circles: "Objects are passed by reference".
That's totally wrong.
If it was true, then the following code:
var object = { a: 1 , b: 2 } ;

function fn( ob ) {
    ob = { c: 3 , d: 4 } ;
}

fn( object ) ;
console.log( object );
... would output { c: 3, d: 4 }, but actually object still references { a: 1, b: 2 }.
So what happened really at function call?
Nothing unusual: each caller's argument is assigned to the corresponding callee's parameter, just as if you had manually used the = operator. There is no special case for objects.
When you pass a variable by reference in a language that supports this feature, the caller's and callee's variables are identical, as if they were aliases of each other, so mutating one mutates the other.
Here in Javascript, we have two distinct variables, that happen to point to the same object... ... ... until re-assignment happens.
That's why I prefer to say that a variable, after an object assignment, behaves like a pointer to that object. In a C/C++ fashion, object = { a: 1, b: 2 } should be understood as object = &( { a: 1, b: 2 } ).
How to perform a shallow copy of an object in Javascript
Javascript does not have built-in object-cloning facilities.
A quick and dirty way to clone an object would be to create a new empty object, then iterate over the original to copy properties one by one.
This naive function will do the trick:
function naiveShallowCopy( original ) {
    // First create an empty object
    // that will receive copies of properties
    var clone = {} ;
    var key ;
    
    for ( key in original ) {
        // copy each property into the clone
        clone[ key ] = original[ key ] ;
    }
    
    return clone ;
}
However, there are a few issues with this code:
The clone doesn't have the same prototype as the original; it is simply an instance of Object.
Inherited properties of the original (inherited from its prototype) are copied into the clone as regular owned properties.
Only enumerable properties are copied.
Property descriptors are not copied, e.g. a read-only property of the original will be writable in the clone.
Finally: if a property is an object, it will be shared between the clone and the original; their respective properties will point to the same object.
The 5th point is what makes it a shallow copy: only the surface of the object is cloned, deeper objects are shared.
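Here is that 5th point in action (naiveShallowCopy() is re-declared so the snippet is self-contained):

```javascript
function naiveShallowCopy( original ) {
    var clone = {} ;
    var key ;
    for ( key in original ) { clone[ key ] = original[ key ] ; }
    return clone ;
}

var original = { title: 'original' , nested: { count: 1 } } ;
var clone = naiveShallowCopy( original ) ;

clone.title = 'clone' ;      // top-level property: independent
clone.nested.count = 42 ;    // nested object: shared with the original!
```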
A variant using Object.keys() can be used if we want to copy only owned and enumerable properties:
function shallowCopyOfEnumerableOwnProperties( original ) {
    // First create an empty object
    // that will receive copies of properties
    var clone = {} ;
    var i , keys = Object.keys( original ) ;
    
    for ( i = 0 ; i < keys.length ; i ++ ) {
        // copy each property into the clone
        clone[ keys[ i ] ] = original[ keys[ i ] ] ;
    }
    
    return clone ;
}
If you want to copy non-enumerable properties as well, you can replace Object.keys() with Object.getOwnPropertyNames():
function shallowCopyOfOwnProperties( original ) {
    // First create an empty object
    // that will receive copies of properties
    var clone = {} ;
    var i , keys = Object.getOwnPropertyNames( original ) ;
    
    for ( i = 0 ; i < keys.length ; i ++ ) {
        // copy each property into the clone
        clone[ keys[ i ] ] = original[ keys[ i ] ] ;
    }
    
    return clone ;
}
Still, non-enumerable properties will become enumerable properties in the clone... To preserve the prototype and the property descriptors, we can combine Object.create(), Object.getOwnPropertyDescriptor() and Object.defineProperty():
function shallowCopy( original ) {
    // First create an empty object with
    // the same prototype as our original source
    var clone = Object.create( Object.getPrototypeOf( original ) ) ;
    var i , keys = Object.getOwnPropertyNames( original ) ;
    
    for ( i = 0 ; i < keys.length ; i ++ ) {
        // copy each property into the clone
        Object.defineProperty( clone , keys[ i ] ,
            Object.getOwnPropertyDescriptor( original , keys[ i ] ) ) ;
    }
    
    return clone ;
}
Okay, this is far better.
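A quick check of what the descriptor-aware version buys us (shallowCopy() is re-declared so the snippet is self-contained): a non-writable, non-enumerable property stays that way in the clone.

```javascript
function shallowCopy( original ) {
    var clone = Object.create( Object.getPrototypeOf( original ) ) ;
    var i , keys = Object.getOwnPropertyNames( original ) ;
    for ( i = 0 ; i < keys.length ; i ++ ) {
        Object.defineProperty( clone , keys[ i ] ,
            Object.getOwnPropertyDescriptor( original , keys[ i ] ) ) ;
    }
    return clone ;
}

var original = {} ;
// a read-only, non-enumerable property on the original
Object.defineProperty( original , 'id' , {
    value: 42 , writable: false , enumerable: false
} ) ;

var clone = shallowCopy( original ) ;
var descriptor = Object.getOwnPropertyDescriptor( clone , 'id' ) ;
// descriptor.writable and descriptor.enumerable are both false:
// the descriptor survived the copy
```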
Next time we will go further, we will see how to perform deep copy, and inspect issues that cannot be overcome easily.