HN Gopher Feed (2017-12-05) - page 1 of 10
___________________________________________________________________
Neural Networks in JavaScript with Deeplearn.js
105 points by rwieruch
https://www.robinwieruch.de/neural-networks-deeplearnjs-javascript/
___________________________________________________________________
XCSme - 29 minutes ago
I would also recommend Synaptic.js http://caza.la/synaptic/ It's
easy to get started with and very good for learning the basics of
NNs quickly.
visarga - 1 hour ago
How does it compare to regular Python DL frameworks?
ericand - 1 hour ago
This explanation from a Google blog post helped me: "The API mimics
the structure of TensorFlow and NumPy, with a delayed execution
model for training (like TensorFlow), and an immediate execution
model for inference (like NumPy). We have also implemented
versions of some of the most commonly-used TensorFlow operations.
With the release of deeplearn.js, we will be providing tools to
export weights from TensorFlow checkpoints, which will allow
authors to import them into web pages for deeplearn.js inference."
https://research.googleblog.com/2017/08/harness-power-of-mac...
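To make the delayed-vs-immediate distinction concrete, here is a
minimal plain-JavaScript sketch; the names addImmediate, addDeferred
and run are illustrative only, not the deeplearn.js API:

  // Immediate ("NumPy-style") execution: each call computes a
  // result right away.
  function addImmediate(a, b) {
    return a.map((x, i) => x + b[i]);
  }
  console.log(addImmediate([1, 2, 3], [4, 5, 6])); // [5, 7, 9]

  // Delayed ("TensorFlow-style") execution: calls only build a
  // graph of nodes; nothing runs until the graph is evaluated.
  function addDeferred(a, b) {
    return { op: 'add', inputs: [a, b] };
  }
  function run(node) {
    if (Array.isArray(node)) return node;       // leaf: raw data
    const [a, b] = node.inputs.map(run);        // evaluate inputs
    return a.map((x, i) => x + b[i]);           // then apply the op
  }
  const graph = addDeferred([1, 2, 3],
                            addDeferred([4, 5, 6], [7, 8, 9]));
  console.log(run(graph)); // [12, 15, 18]

The delayed style lets a library see the whole graph before running
it, so it can plan the GPU work up front; the immediate style is
easier to debug, which is what the eager-mode discussion below is
about.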
amelius - 49 minutes ago
What is so appealing about a delayed execution model? Why can't
we just perform tensor math as in numpy, and let the library
figure out the fastest way to do it behind the scenes? I think
the whole "graph" approach is making things needlessly
complicated.
nsthorat - 28 minutes ago
Author of deeplearnjs here. We hear you, and we 100% agree.
Stay tuned.
amelius - moments ago
That's great to hear. By the way, if you'd make your
interface more general than "deep learning", your library
could be the start of an alternative to numpy/scipy in JS,
and it would be even faster than the original Python
version because it uses the GPU. Just a thought ... (One
downside is that JS doesn't have the nice operator
overloading that Python has.)
dsmilkov - 19 minutes ago
Check out our roadmap:
https://deeplearnjs.org/docs/roadmap.html (we are working on
eager mode / define-by-run computation)
yazr - 1 hour ago
Does anyone have performance figures? Is it 3x, 10x, 100x slower
than a standalone GPU lib?!
jorgemf - 4 minutes ago
It uses WebGL to access the GPU, so its performance is probably
close to that of standalone libs (but I guess it will depend on
the kernels the network uses and how well WebGL can handle them).
ericand - 1 hour ago
If you are looking for it (like I was), here's a direct link to the
deeplearn.js project: https://deeplearnjs.org/
rwieruch - 46 minutes ago
OP here: Please check out the GitHub repository of the Neural
Network in JS too [0]. Would love to see some exciting discussions
around machine learning in JS. I am exploring the topic heavily at
the moment and I am keen to apply/write/educate/learn more about
it.
[0] https://github.com/javascript-machine-learning/color-accessi...
nerfhammer - 1 hour ago
I was going to write a cynical complaint about how it's hardly
useful without GPU support... but it's using WebGL to hit the GPU.
Of course. And it's probably a million times easier than trying to
install a TensorFlow stack locally on your desktop.
rawnlq - 1 hour ago
Dumb question, but can someone give me a summary of how you can
implement this in WebGL? I thought there are only vertex/fragment
shaders and compute shaders aren't supported yet? Do you just
pretend everything is pixel data?
nsthorat - 30 minutes ago
Author of deeplearn.js here. A quick summary: we store NDArrays
as floating point WebGLTextures (in rgba channels).
Mathematical operations are defined as fragment shaders that
operate on WebGLTextures and produce new WebGLTextures. The
fragment shaders we write operate in the context of a single
output value of our result NDArray, which gets parallelized by
the WebGL stack. This is how we get the performance that we do.
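A rough sketch of what such a fragment-shader "kernel" can look
like (illustrative GLSL held in a JS string, not deeplearn.js's
actual shaders): the shader body runs once per output texel, so an
elementwise op reads the matching texel of each input texture and
writes one texel of the result.

  // Illustrative only. The fragment shader runs once per output
  // texel; the GPU parallelizes it across the whole result texture.
  const elementwiseAddShader = `
    precision highp float;
    uniform sampler2D A;      // first input NDArray as a texture
    uniform sampler2D B;      // second input texture
    uniform vec2 outputSize;  // width/height of the result texture
    void main() {
      vec2 uv = gl_FragCoord.xy / outputSize;
      gl_FragColor = texture2D(A, uv) + texture2D(B, uv);
    }`;
  // To run it: attach the result texture to a framebuffer, draw a
  // full-screen quad with this shader, then either read the pixels
  // back or feed the texture into the next operation's shader.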
SoapSeller - 13 minutes ago
Which... is pretty much how GPGPU started in the early 2000s.
Sad/funny how we are going through this cycle again. It will be
interesting to see if the industry will produce a standard
for GPGPU in the browser, given that the desktop standard is
less common than a proprietary one.
tehsauce - 28 minutes ago
Compute shaders are not supported in WebGL, but it is possible to
perform vector operations by rendering to a texture with the
fragment shader. It's basically a hack. The trick is rendering
an image without putting it on the screen, storing arbitrary
data in the pixels. This has limitations, but is actually
good enough for many vector and matrix operations. I believe
this method was even used with desktop OpenGL when GPUs were
first being used for general computing and didn't yet have more
flexible APIs like OpenCL/CUDA.
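For the "arbitrary data in the pixels" part, the upload step on the
JS side might look roughly like this; a sketch assuming a WebGL1
context where gl.getExtension('OES_texture_float') succeeded, with
a made-up helper name:

  // Sketch: pack a Float32Array into an N x 1 floating-point RGBA
  // texture (requires the OES_texture_float extension in WebGL1).
  function createDataTexture(gl, data) {
    const n = data.length;
    const rgba = new Float32Array(n * 4);
    for (let i = 0; i < n; i++) rgba[i * 4] = data[i]; // R channel
    const tex = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, tex);
    // NEAREST + CLAMP_TO_EDGE so non-power-of-two sizes work.
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
    gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, n, 1, 0,
                  gl.RGBA, gl.FLOAT, rgba);
    return tex;
  }
  // Usage:
  //   const tex = createDataTexture(gl, new Float32Array([1, 2, 3]));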
kaoD - 39 minutes ago
> Do you just pretend everything is pixel data?
Pretty much. It's inconvenient because shaders are meant for
vertices/fragments but it still works.
Houshalter - 26 minutes ago
It's super sad to me that the only convenient cross-platform way
to do deep learning is a hack on top of JS.
jorgemf - moments ago
When you want performance, the only way is getting close to the
hardware. That is why deep learning is usually Linux+NVIDIA.
That is why game engines use/used C or even assembly code for
critical parts. Don't expect anything better soon, as companies
focus on performance per watt and will only develop things along
that path.
connorelsea - 23 minutes ago
Not sad - this is the beauty of modern JS.
nsthorat - 21 minutes ago
This is just the beginning :)
rwieruch - 11 minutes ago
<3
state_less - 3 minutes ago
I've been waiting years for this. Once compute shaders became
available to the browser I planned to finally get around to
writing some code, because you could project out to so many
platforms. This sucks the entropy out of the coding task. You
don't have to have multiple implementations, with unique bugs,
on multiple platforms. JS has improved over the years, but you
can also go with a typed language if you want, like PureScript
or TypeScript.
jorgemf - 6 minutes ago
I hope web pages don't grow to hundreds of megabytes just because
they have a Neural Network embedded.