A couple of weeks ago I started working on a refactor of the whole WebRTC Explorer project. Since I started talking about it, I’ve received a lot of feedback and interest in the module, with people requesting new features or simply tinkering with it for their own projects. This is everything one could wish for in an open source project, so thank you!
Quick intro to WebRTC Explorer
If you are new to WebRTC Explorer, it essentially brings Packet Switching to the Application Level, using WebRTC Data Channels as the transport layer between nodes.
WebRTC enables communication between browsers without needing mediators (servers), enabling us, the users, to route packets between machines, using the Web technologies only.
WebRTC Explorer is inspired by the Chord routing scheme, creating routing tables that are evenly balanced across nodes.
I’ve explained my main motivations in more detail at OPOJS; you can watch my talk below:
Although that talk was given while I was working on version 1.0.0, the goals and core mechanisms are still the same today.
The second iteration of WebRTC Explorer is all about robustness (and speed) and developer UX (making it easy to use WebRTC Explorer as a drop-in replacement for other transports) and, since we are at it, adding features that have been requested for a while, such as signalling through the Chord routing instead of only through the signalling server.
The first update: I’ve decided to make a logo for the project, woot! It is really simple and, even though it has no technical advantage whatsoever, it makes me feel that the repo looks better organized. Check it out at the top of this blog post or at the repo.
Overall code revamp
The entire codebase was refactored (https://github.com/diasdavid/webrtc-explorer), dependencies were updated, the code style was migrated to standard, and the signalling server is now part of the webrtc-explorer repo, quickly accessible as a CLI tool once the module is installed globally.
> npm i webrtc-explorer -g
Signalling Server Started
Complete API redesign
The interface moved from a message-oriented style (‘send’, ‘receive’) to a Node.js net.Socket-like interface, so that it can be compatible with the interfaces specified by interface-transport and interface-connection, making it easy to drop in webrtc-explorer as a replacement for other transports.
Two signalling modes
One of the most requested features was the ability to exchange signalling data through the Routing Overlay. This is still a work in progress, but it will be part of the final 2.0.0 release.
Select the fingers to use from the node itself
Previously, the setup required specifying in advance how many fingers each node should have. Now, with the new call explorer.updateFinger(<row>), each node can select which rows from its finger table to update, creating channels only to those. The only remaining requirement is that every node has at least one finger to its successor.
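For context on what a “row” means here: in Chord, row i of a node’s finger table ideally points at the first node whose id succeeds (n + 2^i) mod 2^m. A quick sketch of computing those ideal targets (the function name `fingerTargets` is mine, purely for illustration):

```javascript
// Ideal Chord finger-table targets for a node id in an m-bit id space.
// Row i points at the first live node succeeding (nodeId + 2^i) mod 2^m;
// explorer.updateFinger(<row>) lets a node pick which rows to materialize.
function fingerTargets (nodeId, m) {
  const targets = []
  for (let i = 0; i < m; i++) {
    targets.push((nodeId + Math.pow(2, i)) % Math.pow(2, m))
  }
  return targets
}

// In a 4-bit id space, node 1 would ideally track ids 2, 3, 5 and 9
console.log(fingerTargets(1, 4)) // [ 2, 3, 5, 9 ]
```

Being able to skip rows means a node that only needs its successor (row 0) doesn’t pay the cost of opening data channels for the rest of the table.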
More complete, and still growing, API and Architecture documentation is available in the README of the project.
piri-piri, the P2P browser testing framework, was updated to use Electron shells instead of full browsers, making it faster. The webrtc-explorer tests were also rewritten, a bunch more were added, and the test base will continue to grow.
Some of the attendees of ArcticJS getting completely blown away by the Northern Lights
tl;dr: From my current perspective, one of the ways we can cause greater growth is by making Node.js part of CS degree curricula and the tool of choice for students and researchers to develop their projects. This blog post is a collection of notes from my experience as a Student, a Researcher, and now as a Professor at the University of Lisbon, which has created a course from the ground up for Modern Web Development. You can also watch the talk on video below:
As a student
From the moment I entered college (literally, since I finished the first semester), I started being bombarded with job offers from all kinds of companies based in Portugal. It was good to know that I had picked a degree that would open the door to so many career options, but little did I know how anything worked.
Later, I started noticing that the majority of these job offers were to work with established technologies (also known as something that is not particularly exciting anymore), such as PHP or Java, and most of them were for ‘consulting’. Looking at what was exciting on the Internet, I thought several times to myself that these companies were clearly ‘old fashioned’ for not having projects with the tech that all the ‘cool kids’ were talking about.
Time passed, I became very interested in what was happening in the Node.js world, and something caught my attention: with the explosive Node.js growth rate, a ton of new jobs suddenly appeared. Reading the case stories from companies that moved their stacks from some other language to Node.js, one of the main reasons presented was that they would be able to hire great new talent for their teams. This might not seem relevant for small companies, but when you need to grow your team by hundreds or thousands of developers, the available talent pool might weigh as much in the decision to pick a technology as the quality and features that technology could bring to your products.
This becomes a cycle: thousands of students graduate every year from CS degrees, and where the CS degree is for the most part fully focused on Java and Web Services with SOAP, that is the talent pool that will be available. I don’t have any numbers, but judging by the sample I have from my University classmates and their peers, a significant number of graduates, 4 years after finishing their degree, are still working on the same technology and environment they learned during their degrees.
This isn’t necessarily a bad thing, but it makes the point that what gets taught at Universities is, in some part, what is going to drive the technology decisions in future product generations.
Stepping into Research land (doing my M.Sc.)
When I got to my M.Sc. I already had a job. It was tough doing the two things at the same time (plus LXJS, the Startup Scholarship, and other projects), but doable and very rewarding. What this enabled me to do was be very picky about what was going to be my M.Sc. thesis, because I didn’t have the ‘pick something and get it done quickly, because I am burned out of college’ mentality that other graduate students have. The M.Sc. was definitely very valuable for me.
I went and ‘interviewed’ a bunch of professors, and I came up with my own idea: build a P2P Distributed Computing platform using the Web Platform and WebRTC. Some of them said no; after all, why would they mentor me in work that was not relevant to their research? But eventually I met the right fit, and I got both great mentorship for the thesis and all the freedom to research what I wanted. It was great, but then the fun began.
The academic world is undoubtedly very intriguing. It works mostly around reputation, and researchers are typically very protective of their ideas, which was essentially the opposite of what I had been experiencing in the Open Source ecosystem. When I explained that I was making everything open source from the beginning, I was told that ‘I could not do that’. My obvious reaction to this statement was ‘Why?’, to which the answer was ‘What if someone publishes the work before you?’. I guess it is a valid point, but honestly I wasn’t concerned with that; I was frustrated by the fact that there was so much research published that I could not test for myself because the code was not public.
In the end, everything went really well. Because I did it all open source, I got invited to give talks about it, and a lot of people started reviewing it and using it for their own research at other Universities, from grad students to PhD candidates, even before I delivered the Thesis! It was very exciting. (The traditional cycle would be: 1) deliver the Thesis, 2) submit and wait to get approved at some conference to publish a paper, 3) hope that someone notices the paper and references it.)
Sitting on the other side, notes from becoming a Professor
Something must have gone right, because now I had not only the opportunity to teach (which I had secretly wished for for a while), but also to be the one making the calls on what was going to be delivered and how. I decided to have 3 main foundations:
a) Use Node.js
b) Everything has to be open source
c) Labs had to use the nodeschool format
a) Use Node.js
It had to be, not only for what Node.js is, but for the inspiration that the community has to offer, from science to music, robots, and conferences. It is a world of fantastic people, and if I was going to teach a class, the first thing I wanted to make sure of was to open the door to this ecosystem.
Funny fact: I got asked why I added ‘npm’ to the course curriculum, and the question went a bit like this: “Do you really have to teach ‘npm’? Isn’t that just the package manager? Can’t they use Node.js without it?”. My answer was an excited “of course it has to be part of the course; Node.js is not just about the technology, it is also about the ecosystem”. In that moment I remembered how no one is really incentivized to use any package manager during the other courses; everything is built from scratch, and looking at your neighbor’s code was called cheating.
b) Everything has to be open source
Following point a), one of the things I wanted was for students to develop their projects in the open, using GitHub as the host. This was an incentive to feel OK about looking at other people’s code and learning from it; as long as they made the effort to understand and not just copy-paste, the quality of the output could only increase.
My honest goal was to make this more of a standard practice amongst students, so that the next time someone wants to do research for their M.Sc. or PhD, doing it open source is just natural.
c) Labs had to use the nodeschool format
I love running NodeSchool Lisbon and helping out at other NodeSchool events across the globe when I’m around. It is fun, people love it, I’ve seen everyone from beginners to advanced developers, and everyone always takes something away from it, so there was no reason not to use the same format.
I received several positive comments from the students; they seemed to love it. In fact, during these lab classes (2h30 long), no one wanted to take the mandatory class break, and some of them even stayed 30 to 45 minutes more to continue learning. One student asked if I could give all of the lab classes and if they could always be that way :). I of course explained to them the phenomenon known as JIFASNIF.
There is no need to be afraid of pushing for open collaboration and research in the academic environment; it can be more interesting for the researcher and advantageous for the University, creating a win-win scenario.
Node.js can make learning CS even more fun; nodeschool taught us that. There is a huge pool of people that would be very excited to learn Node.js and use it for their own endeavours, when correctly introduced to it.
Using Node.js also helps the speed of learning; since it is so fast to hack something together, Q&A sessions can be interactive and productive.
What can we do to increase Node.js adoption in college curricula
I understand that it is not easy to go to a University and change the curriculum altogether. In fact, my story required a syzygy of its own; it is very rare to become a professor if you are not already part of the University and/or a PhD student, and teaching assistants (grad students) typically don’t get to weigh in on course decisions, much less create a course from scratch. Nevertheless, there are some things we can do more easily:
Organize a NodeSchool event inside a University, planting the seed and watching it grow.
Become a mentor. I’ve given some help and guidance to younger students that showed interest in learning more. Now, some of them even have full time Node.js jobs or are just using Node.js for their Academic projects.
Try to tap into the research groups of Universities: ask what their focus is, ask if they have anything open source, and ask if you can contribute. The chain reaction that can come from asking these questions might open up a project that wasn’t open before.
Stellar Module Management - Install your Node.js modules using IPFS
Node.js Interactive, the first Node.js conference organized by the Linux Foundation, happened on Dec 8-9 of 2015. There were hundreds of participants, and dozens of really amazing talks divided into 3 specific tracks: backend, frontend and IoT.
You can learn about that project in this blog post, check out the talk slides or wait for the video recording of the talk. I will update this blog post when that happens.
registry-mirror enables distributed discovery of npm modules by fetching and caching the latest state of npm through IPNS, the InterPlanetary Naming System. With this state, a node in the network is capable of querying the IPFS network for an npm module’s cryptographic hash, fetching it from any peer that has it available.
In order to get started, you must first be sure that you are running IPFS 0.4.0. IPFS 0.4.0 is not yet released, but you can already use it by compiling from source or downloading the pre-built binary.
Please make sure you have Go 1.5.2 or above installed.
Downloading pre-built Binary
Download the pre-built binary for your OS and Arch at gobuilder.
Installing and running registry-mirror
Once you have IPFS 0.4.0 available, install registry-mirror by running the following command (you should have Node.js 4 and npm 2 or above available):
$ npm i registry-mirror -g
Then start your IPFS daemon by running:
$ ipfs daemon
Swarm listening on /ip4/127.0.0.1/tcp/4001
Swarm listening on /ip4/172.19.248.69/tcp/4001
Swarm listening on /ip6/::1/tcp/4001
API server listening on /ip4/127.0.0.1/tcp/5001
Gateway (readonly) server listening on /ip4/127.0.0.1/tcp/8080
Daemon is ready
After, run the registry-mirror daemon with the --ipfs option:
$ registry-mirror daemon --ipfs
Now, to install a module using IPFS, you only need to set this local registry when running an npm install. This can be done through config or a command line argument:
$ npm i bignumber --registry=http://localhost:9595
npm http request GET http://localhost:9595/bignumber
npm http 200 http://localhost:9595/bignumber
npm http fetch GET http://localhost:9595/bignumber/-/bignumber-1.1.0.tgz
npm http fetch 200 http://localhost:9595/bignumber/-/bignumber-1.1.0.tgz
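The config route mentioned above can be a one-line entry in your .npmrc, so that every install goes through the mirror without passing the flag each time (the registry URL matches the default registry-mirror port used above; revert the entry to go back to the public registry):

```ini
; .npmrc — route all installs through the local registry-mirror
registry=http://localhost:9595
```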
registry-mirror itself is quite a simple application, as most of the heavy lifting is done by IPFS. IPFS’s distributed nature affords a set of really nice features as a transport layer that registry-mirror leverages to create its service.
Find where the module lives without having to hit the backbone
With registry-mirror, a registry becomes a curated list of hashes. While the modules live in the network, as soon as registry-mirror caches this list locally (which it gets from the IPFS network), it has the hashes of the modules that a user might need in the future. With this list, a user doesn’t have to know the whereabouts of a module until they need to request it from the network.
This list is fetched and kept up to date through IPNS. This ensures secure distribution, as IPNS records are signed with the publisher’s private key and can be validated by anyone.
Just like git, registry-mirror is able to work offline and/or in a disconnected scenario. As long as the module you are looking for exists in the network you are currently in, IPFS will be able to find it through its Peer and Content Routing (e.g. with a DHT).
Enable several registries to coexist
Once the notion of a registry becomes a curated list of modules available, enabling more than one registry to exist becomes simpler. This scenario can be especially interesting for private networks such as the ones within companies and organizations that don’t want their modules to be publicly known and available.
Run only what you were looking for
Just like git, IPFS verifies the content received using cryptographic hashing, making sure it is exactly what was requested — you can always be sure that what you are running is what you asked for.
By leveraging local and network caches efficiently, downloading your dependencies can be much faster as it avoids going to npm’s servers or CDN all the time. This can be crucial in high latency networks or more remote areas.