Andrew Ng is a co-founder of Coursera and the director of the Stanford AI Lab. In 2011 he led the development of Stanford University's main MOOC (Massive Open Online Course) platform and taught an online Machine Learning class offered to over 100,000 students, which led to the founding of Coursera.
A few months ago, I spoke at a conference where I explained the difference between caching and an in-memory data grid. Today, having realized that many people also want to better understand the difference between two major categories of in-memory computing, the In-Memory Database and the In-Memory Data Grid, I am sharing a succinct version of my thinking on this topic.
Nathan Marz coined the term Lambda Architecture (LA) for a generic, scalable, and fault-tolerant data processing architecture, based on his experience working on distributed data processing systems at BackType and Twitter.
It seems like everyone is trying to learn to code: Code.org has celebrities like Bill Gates, Mark Zuckerberg, and Chris Bosh telling you anyone can code; CoderDojos are springing up all over the country; and the UK has made coding part of its official curriculum for all grade-school kids.
R 3.1.1 (codename “Sock it to Me”) was released today! You can get the latest binary version from here (or the .tar.gz source code from here). The full list of new features and bug fixes is provided below. Upgrading to R 3.1.1 on Windows: If you are using Windows, you can easily upgrade to the latest version of R using the […]
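The teaser is cut off before naming the tool, but one common route for upgrading R in place on Windows is the installr package; the following is a minimal sketch under that assumption, not necessarily the method the full post describes.

    # Run from within an existing R session on Windows.
    # installr is an assumed choice here, not confirmed by the truncated post.
    install.packages("installr")  # install the helper package if not already present
    library(installr)
    updateR()                     # downloads and installs the newest R release
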
Roughly a year ago I published an article about parallel computing in R here, in which I compared computation performance among four packages that provide R with parallel features, since R is essentially a single-threaded task package.
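For readers unfamiliar with those packages, here is a minimal sketch using base R's parallel package (one plausible candidate; the article does not say which four were compared), replacing a serial lapply with a parallelized equivalent on a toy CPU-bound task.

    library(parallel)

    # Toy CPU-bound task: estimate pi by Monte Carlo sampling
    estimate_pi <- function(n) {
      x <- runif(n)
      y <- runif(n)
      4 * mean(x^2 + y^2 <= 1)
    }

    # Serial version: one core does all eight tasks
    serial_res <- lapply(rep(1e6, 8), estimate_pi)

    # Parallel version: spread the eight tasks over the available cores
    cl <- makeCluster(detectCores() - 1)
    clusterExport(cl, "estimate_pi")          # make the function visible to workers
    parallel_res <- parLapply(cl, rep(1e6, 8), estimate_pi)
    stopCluster(cl)
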