Scala, Spark and Deeplearning4j
Scala is one of the most exciting languages created in the 21st century. It is a multi-paradigm language that fully supports functional, object-oriented, imperative and concurrent programming. It also has a strong type system, and from our point of view, strong typing is a convenient form of self-documenting code.
Scala works on the JVM and has access to the riches of the Java ecosystem, but it is less verbose than Java. As we employ it with ND4J, its syntax is strikingly similar to that of Python, a language many data scientists are comfortable with. Like Python, Scala makes programmers happy, but like Java, it is quite fast.
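As a small illustration of that Python-like conciseness with compile-time type checking, here is a sketch in plain Scala (standard library only; the names `Point` and `Demo` are ours, not part of ND4J):

```scala
// Type inference keeps Scala nearly as concise as Python,
// while the compiler still checks every type at compile time.
object Demo {
  // A case class documents its own shape: two Doubles, nothing more.
  case class Point(x: Double, y: Double)

  def main(args: Array[String]): Unit = {
    // Functional style: map/filter over an immutable collection.
    val points = List(Point(1, 2), Point(3, 4), Point(5, 6))
    val xs = points.map(_.x).filter(_ > 1) // inferred as List[Double]
    println(xs.sum)
  }
}
```

Misuse, such as `points.map(_.x).filter(_ > "1")`, fails at compile time rather than at runtime, which is the self-documenting safety the type system buys you.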
Finally, Apache Spark is written in Scala, and any library that purports to work on distributed runtimes should at the very least be able to interface with Spark. Deeplearning4j and ND4J go a step further: they run on a Spark cluster and offer Scala APIs, called ScalNet and ND4S respectively.
We believe Scala’s many strengths will lead it to dominate numerical computing, as well as deep learning. We think that will happen on Spark. And we have tried to build the tools to make it happen now.
- Deeplearning4j on Spark
- Learn: The Scala Programming Language
- A Scala Tutorial for Java programmers (PDF)
- Scala By Example, by Martin Odersky (PDF)
- An Intro to Scala on ND4J
- Our early-stage Scala API (one example on GitHub)
- SF Spark Talk: Deeplearning4j on Spark, and Data Science on the JVM, with ND4J
- Q&A with Adam Gibson about Spark with Alexy Khrabrov
- Our Spark integration
- ND4J: Scientific Computing for the JVM
- Scala Basics for Python Developers
- Why We Love Scala at Coursera
A non-exhaustive list of organizations using Scala:
- Bank of America
- Credit Suisse
- (The) Guardian
- (The) Weather Channel