Declarative Machine Learning

Expressive, Extensible and Concise

In Wolfe, you use Scala to define machine learning models in terms of real-valued functions such as densities, energy functions, and empirical losses, almost as concisely as in a machine learning paper. This paradigm covers conditional random fields, generative models, Markov logic networks, matrix factorization, and more. See the examples below and follow our interactive tutorials.

  • // transition and observation features of a linear-chain CRF
    def f(s:Sentence) = {
      val n = s.words.size
      sum(0 until n) { i => oneHot(s.words(i) -> s.tags(i)) } +
      sum(0 until n - 1) { i => oneHot(s.tags(i) -> s.tags(i + 1)) }}
    
    //the corresponding linear model
    def s(w:Vector)(s:Sentence) = w dot f(s)
    
  • // MAP inference
    def h(w:Vector)(x:Sentence):Sentence =
      argmax(sentences st (obs(_) == obs(x))) { d => s(w)(d) }
    
    // Loss over training data
    def loss(data:Seq[Sentence])(w:Vector):Double =
      sum(data) { d => s(w)(h(w)(d)) - s(w)(d) }
    
    // Parameter estimation
    def learn(data:Seq[Sentence]) = argmin(vectors) { w => loss(data)(w) }
    
  • // latent factorization and neighborhood model
    def s(w:Vector)(u:UserItem) =
      sum(0 until k) { i => w(u.item -> i) * w(u.user -> i) } +
      sum(u.user.items) { i => w(i -> u.item) }
    
    // training loss over observed cells
    def loss(data:Seq[UserItem])(w:Vector) =
      sum(data) { d => pow2(d.rating - s(w)(d)) }
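To see what the CRF feature map computes, here is a minimal plain-Scala sketch of the first example. It assumes sparse vectors represented as `Map[String, Double]`; the `oneHot`, `sum`, and `dot` helpers are stand-ins for Wolfe's primitives, not the real API, and `score` plays the role of the linear model `s` above.

```scala
// Plain-Scala sketch of the linear-chain CRF feature map above.
// Sparse vectors are Map[String, Double]; oneHot/sum/dot are stand-ins
// for Wolfe's DSL primitives, not the actual Wolfe API.
object CrfSketch {
  type Vector = Map[String, Double]

  case class Sentence(words: Seq[String], tags: Seq[String])

  // a unit vector with a single active feature
  def oneHot(key: String): Vector = Map(key -> 1.0)

  // element-wise addition of two sparse vectors
  def add(a: Vector, b: Vector): Vector =
    (a.keySet ++ b.keySet)
      .map(k => k -> (a.getOrElse(k, 0.0) + b.getOrElse(k, 0.0))).toMap

  // vector-valued sum over an index range
  def sum(range: Range)(f: Int => Vector): Vector =
    range.map(f).foldLeft(Map.empty[String, Double])(add)

  // transition and observation features, mirroring f(s) above
  def f(s: Sentence): Vector = {
    val n = s.words.size
    add(
      sum(0 until n)(i => oneHot(s.words(i) + "->" + s.tags(i))),
      sum(0 until n - 1)(i => oneHot(s.tags(i) + "->" + s.tags(i + 1))))
  }

  // sparse dot product
  def dot(w: Vector, v: Vector): Double =
    v.map { case (key, x) => w.getOrElse(key, 0.0) * x }.sum

  // the corresponding linear model
  def score(w: Vector)(s: Sentence): Double = dot(w, f(s))

  def main(args: Array[String]): Unit = {
    val sent = Sentence(Seq("I", "run"), Seq("PRP", "VBP"))
    val w = Map("I->PRP" -> 1.0, "PRP->VBP" -> 0.5)
    println(score(w)(sent)) // prints 1.5
  }
}
```

Two observation features (`I->PRP`, `run->VBP`) and one transition feature (`PRP->VBP`) fire; only the two weighted ones contribute to the score.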
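The factorization loss can be unpacked the same way. The sketch below assumes a rating cell carries its user, item, rating, and the list of items the user has rated; the latent dimension `k` and all key names are illustrative, not the Wolfe API.

```scala
// Plain-Scala sketch of the latent-factorization + neighborhood score
// and its squared training loss. Parameter names and key encodings are
// illustrative assumptions, not the Wolfe API.
object FactorizationSketch {
  type Vector = Map[String, Double]

  // one observed rating cell, with the user's other rated items
  case class UserItem(user: String, item: String, rating: Double,
                      userItems: Seq[String])

  val k = 2 // latent dimension (assumed)

  // latent factorization plus neighborhood term, mirroring s(w)(u) above
  def score(w: Vector)(u: UserItem): Double = {
    val latent = (0 until k)
      .map(i => w.getOrElse(s"${u.item}->$i", 0.0) *
                w.getOrElse(s"${u.user}->$i", 0.0)).sum
    val neighborhood =
      u.userItems.map(i => w.getOrElse(s"$i->${u.item}", 0.0)).sum
    latent + neighborhood
  }

  // squared error over observed cells
  def loss(data: Seq[UserItem])(w: Vector): Double =
    data.map(d => math.pow(d.rating - score(w)(d), 2)).sum
}
```

With weights `"i1->0" -> 2.0`, `"u1->0" -> 1.0`, and `"i2->i1" -> 1.0`, a cell rated 4.0 scores 2 + 1 = 3, giving a squared error of 1.0.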