
High performance Numerical Elixir with Nx

Rob Ellen

23 September 2021 · 4 min read

Numerical performance of Elixir code has received a massive boost with the release of Nx - Numerical Elixir.

Over the past few months, José Valim has been dropping cryptic hints about an exciting but mysterious new Elixir project called Nx. A series of tweets alluding to big performance improvements got the community speculating about what Nx could be.

The wait was finally over on Feb 9th, when all was revealed on the Thinking Elixir podcast. Nx is Numerical Elixir, a tensor programming library for Elixir. With Nx, we can develop tensor (multi-dimensional array) programs in Elixir and use a much faster backend to run them on the CPU or GPU. This is very exciting, as it opens up a whole new set of use cases for Elixir in data science, machine learning, and other numerically focussed workloads.

A new numerical function definition keyword, defn, is also introduced, in which we can use a large subset of Elixir syntax to develop Nx programs. Numerical definitions are compiled to computation graphs that describe the operations we want to perform, which are in turn compiled to efficient CPU or GPU code. Early benchmarks show performance gains of several orders of magnitude over plain Elixir; in some cases, speed improvements of almost 5000x!

Numbats and Numerical Notebooks?

As a nice touch for us Australian alchemists, the mascot of the Nx project is the numbat, a native Australian marsupial. The official Nx code repo on GitHub also has a cute numbat logo.

To see what all the fuss is about, Robert from Alembic (@robertellen) developed a code notebook that explores the Nx API, and presented the notebook at the Elixir Australia March meetup.

  • The video of the talk is now up on YouTube
  • The repository hosting the code notebook is available on GitHub.

You'll need to download the code notebook and run it locally as a Phoenix LiveView application, as the notebook isn't protected against arbitrary code execution (so we can't host it yet). José has recently been hinting at the possibility of hosted notebooks, though, so this may change soon.

Tensors are multi-dimensional arrays

Nx, at its core, is a tensor programming library. In the context of Nx, tensors are multi-dimensional arrays. For anyone familiar with TensorFlow, PyTorch, or NumPy, tensors in Nx are equivalent to the tensors or multi-dimensional arrays in those environments.

To begin working with Nx, we can create tensors from nested lists as well as from binaries, and Nx.reshape allows a shape to be given to a flat tensor built from a binary.

iex(1)> Nx.tensor([[1,2,3],[4,5,6]])
#Nx.Tensor<
  s64[2][3]
  [
    [1, 2, 3],
    [4, 5, 6]
  ]
>
iex(2)> t = Nx.from_binary(<<1::integer,2::integer,3::integer,4::integer>>, {:u,8})
#Nx.Tensor<
  u8[4]
  [1, 2, 3, 4]
>
iex(3)> t |> Nx.reshape({2,2})
#Nx.Tensor<
  u8[2][2]
  [
    [1, 2],
    [3, 4]
  ]
>

Tensors can also be created from shapes (e.g. {2,3} for a 2-by-3 array) in a variety of ways with functions such as Nx.iota (which creates a tensor of values incrementing from zero) or Nx.broadcast (which adds dimensions to a tensor by duplicating its data), as shown below.
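
For example, here's what those two functions produce for a {2,3} shape (a sketch of the expected output; exact formatting may vary slightly between Nx versions):

iex(1)> Nx.iota({2,3})
#Nx.Tensor<
  s64[2][3]
  [
    [0, 1, 2],
    [3, 4, 5]
  ]
>
iex(2)> Nx.broadcast(Nx.tensor([1,2,3]), {2,3})
#Nx.Tensor<
  s64[2][3]
  [
    [1, 2, 3],
    [1, 2, 3]
  ]
>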

Tensor-aware functions

In Nx, we don't have to write loops over the elements of our tensors as we would if they were lists or maps. The Nx library provides a broad range of tensor-aware functions that operate on whole tensors without explicit loops (maps or reduces). This makes the code both terser and more performant.

Examples of tensor-aware functions are:

  • unary functions such as Nx.exp
iex(1)> [[1.0, 2.0],[3.0,4.0]] |> Nx.tensor |> Nx.exp
#Nx.Tensor<
  f32[2][2]
  [
    [2.7182817459106445, 7.389056205749512],
    [20.08553695678711, 54.598148345947266]
  ]
>
  • binary functions such as Nx.add
iex(1)> Nx.add(Nx.tensor([1,2,3,4]), Nx.tensor([5,6,7,8]))
#Nx.Tensor<
  s64[4]
  [6, 8, 10, 12]
>
  • and, aggregate functions such as Nx.sum
iex(1)> [[1.0,2.0],[3.0,4.0]] |> Nx.tensor |> Nx.sum
#Nx.Tensor<
  f32
  10.0
>

The tensor-aware operations are the key to the potential speed-ups that Nx provides. Each backend for Nx implements efficient versions of the operations that execute in parallel.

Beyond the API of tensor operations, Nx introduces Numerical Definitions to make developing tensor code more natural. By defining a function with defn instead of the usual def, we can write tensor code using a broad subset of Elixir syntax, including tensor-aware versions of operators we know from Kernel, such as +, -, *, and /. The canonical example of a numerical definition is the softmax function:

defmodule Formula do
  import Nx.Defn

  defn softmax(t) do
    Nx.exp(t) / Nx.sum(Nx.exp(t))
  end
end
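
Calling a numerical definition looks just like calling a regular function. Here's a sketch of what to expect (output values rounded for readability):

iex(1)> Formula.softmax(Nx.tensor([1.0, 2.0, 3.0]))
#Nx.Tensor<
  f32[3]
  [0.0900, 0.2447, 0.6652]
>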

Easily Configurable Backends

Currently, the Nx contributors have implemented two backends: EXLA and Torchx, Elixir bindings for Google's XLA (Accelerated Linear Algebra) and LibTorch, respectively. Both support CPU- or GPU-based acceleration of Nx functions and numerical definitions, and below we can see how easy it is to specify a backend per tensor or a defn compiler for a whole module.

# Select a backend for an individual tensor
Nx.tensor([1,2,3,4], backend: Torchx.Backend)

# Set the compiler for all numerical definitions in a module
@default_defn_compiler {EXLA, run_options: [keep_on_device: true]}
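
To see the module attribute in context, here's a minimal sketch of a module that compiles its numerical definitions with EXLA (the module name and scale function are hypothetical, and it assumes the EXLA dependency is installed and configured):

defmodule Accelerated do
  import Nx.Defn

  # Compile every defn in this module with EXLA, keeping results on the device (e.g. GPU memory)
  @default_defn_compiler {EXLA, run_options: [keep_on_device: true]}

  # A trivial numerical definition; the * here is the tensor-aware version of the operator
  defn scale(t, factor) do
    t * factor
  end
end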

Benchmarks show a 5000x speed boost!

The Nx GitHub repo presents a nice benchmark for the softmax numerical definition (the figures below are iterations per second, so higher is better):

Comparison:
xla gpu f32 keep      15308.14
xla gpu f64 keep       4550.59 - 3.36x slower +0.154 ms
xla cpu f32             434.21 - 35.26x slower +2.24 ms
xla gpu f32             398.45 - 38.42x slower +2.44 ms
xla gpu f64             190.27 - 80.46x slower +5.19 ms
xla cpu f64             168.25 - 90.98x slower +5.88 ms
elixir f32                3.22 - 4760.93x slower +310.94 ms
elixir f64                3.11 - 4924.56x slower +321.63 ms

The takeaway is that using 32-bit floating-point tensors and keeping them in GPU memory produces massive performance gains.

Nx represents a new frontier for the Elixir ecosystem. Adding high-performance numerical computation to Elixir allows us to solve more problems on the BEAM and will hopefully make Elixir an attractive choice of language in more domains. We commend the Nx contributors for such a great initiative. We're looking forward to seeing where Nx takes Elixir and the BEAM.

https://twitter.com/ElixirSydney/status/1373203685006712837

Elixir Australia

Elixir Australia is a joint effort by Elixir Sydney (sponsored by Alembic) and Elixir Melbourne to host monthly online meetups for the Elixir community. We encourage all Elixir alchemists globally to join either of the meetup groups to keep an eye out for upcoming events (the next scheduled meetup is Wednesday, April 21, 2021, 7:00 PM to 9:00 PM GMT+10). You can also subscribe to the YouTube channel to keep up to date with videos from the meetups.

Please check out the links above, and we hope to see you at an Elixir Australia meetup soon!
