Scaling machine learning techniques to keep pace with "big data" is an interesting engineering problem in its own right. Apache Spark has recently become a popular framework for large-scale ML. Sean introduces the intuition behind modern large-scale recommenders and highlights four ways ML algorithms are engineered to scale.
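For readers who want a concrete starting point, here is a minimal sketch of a Spark-based recommender using MLlib's ALS (alternating least squares). The choice of ALS, the toy ratings data, and the column names are illustrative assumptions; the talk's own examples may use a different algorithm or dataset.

```python
# Minimal PySpark ALS recommender sketch (illustrative, not from the talk).
from pyspark.sql import SparkSession
from pyspark.ml.recommendation import ALS

spark = SparkSession.builder.appName("als-recommender-sketch").getOrCreate()

# Toy explicit-feedback ratings: (userId, itemId, rating).
ratings = spark.createDataFrame(
    [(0, 10, 4.0), (0, 11, 1.0), (1, 10, 5.0), (1, 12, 2.0), (2, 11, 3.0)],
    ["userId", "itemId", "rating"],
)

# ALS factorizes the sparse user-item matrix into low-rank user and item factors,
# which is what lets this kind of recommender scale out across a cluster.
als = ALS(
    userCol="userId",
    itemCol="itemId",
    ratingCol="rating",
    rank=10,
    maxIter=5,
    regParam=0.1,
    coldStartStrategy="drop",
)
model = als.fit(ratings)

# Produce the top 3 item recommendations for every user.
model.recommendForAllUsers(3).show(truncate=False)
```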