21 January 2026

Generalised Tensors in Idris

by André Videla & Bruno Gavranović

Generalised Tensors are the cornerstone of our machine learning research efforts. They were developed by Bruno Gavranović and are implemented in the TensorType library.

This blog post introduces generalised tensors from first principles and provides an implementation in Idris, a programming language with dependent types. This entire blog post is an Idris 2 literate file.

Tensors as we know them

Tensor is the name we give to “vectors of vectors of …” of any depth. Scalars, vectors, and matrices are all familiar examples.

We can unify those three notions into one by saying that a scalar is a tensor of dimension 0 (sometimes also called its “rank”), a vector is a tensor of dimension 1, and a matrix is a tensor of dimension 2:

Common name          Tensor dimension  Tensor shape
scalar               0                 []
vector of length 3   1                 [3]
vector of length n   1                 [n]
3 × 2 matrix         2                 [3, 2]
3 × 3 matrix         2                 [3, 3]
n × m matrix         2                 [n, m]
n × m × o tensor     3                 [n, m, o]

In this way, we identify the shape of a tensor with a list of numbers. The length of the list represents the level of “nesting” of the tensor, and the number at each position tells us how many elements we expect to find at that level.

Tensors with Dependent Types

In most programming languages, tensors are represented by a plain Tensor type that carries no information about their size or depth. Dependent types let us carry exactly that information around. For example, we can define vectors with their length in their type.

-- In Idris, a type-alias is the same as any other definition
Vec3 : Type
Vec3 = Vect 3 Double

We can define sized matrices with vectors of vectors:

Matrix : Nat -> Nat -> Type
Matrix n m = Vect n (Vect m Double)

A tensor of rank three is therefore given by a “vector of vectors of vectors”:

Tensor3 : Nat -> Nat -> Nat -> Type
Tensor3 n m o = Vect n (Vect m (Vect o Double))

We can see the pattern here: a tensor of dimension n is given by n iterations of Vect, and for each iteration we also need the size of the vector. We can achieve this by parameterising our tensor by a list of numbers:

Tensor : List Nat -> Type
Tensor [] = Double -- When the dimension is 0 the tensor is a scalar
Tensor (n  -- `n` is the number of elements at the current dimension
        :: nested) -- `nested` is the list of nested dimensions
           -- When the dimension is 1 + m, we need to provide `n` tensors of dimension m
        = Fin n -> Tensor nested

We can build example tensors for the same shapes as given above.

namespace Examples1
  tensorScalar : Tensor []
  tensorScalar = 9

  tensorVec3 : Tensor [3]
  tensorVec3 0 = 7
  tensorVec3 1 = 8
  tensorVec3 2 = 0


  tensorMat3x3 : Tensor [3,3]
  tensorMat3x3 0 0 = 1
  tensorMat3x3 0 1 = 2
  tensorMat3x3 0 2 = 3
  tensorMat3x3 1 0 = 4
  tensorMat3x3 1 1 = 5
  tensorMat3x3 1 2 = 6
  tensorMat3x3 2 0 = 7
  tensorMat3x3 2 1 = 8
  tensorMat3x3 2 2 = 9

  tensor3x2x2 : Tensor [3,2,2]
  tensor3x2x2 0 0 0 = 4
  tensor3x2x2 0 0 1 = 5
  tensor3x2x2 0 1 0 = 7
  tensor3x2x2 0 1 1 = 9
  tensor3x2x2 1 0 0 = 3
  tensor3x2x2 1 0 1 = 3
  tensor3x2x2 1 1 0 = 9
  tensor3x2x2 1 1 1 = 0
  tensor3x2x2 2 0 0 = 1
  tensor3x2x2 2 0 1 = 2
  tensor3x2x2 2 1 0 = 3
  tensor3x2x2 2 1 1 = 4

The above definition is not terribly ergonomic, since we need to write a function mapping every coordinate to a value, but it will reveal something important later.
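One thing this representation does make easy, though, is lookup: reading an entry is just function application. Here is a tiny self-contained sketch, not from the original post (indexDiag and indexExample are names made up for illustration):

-- Not from the original post: with the function-based representation,
-- indexing a tensor is plain function application.
indexDiag : Tensor [2, 2]
indexDiag i j = if i == j then 1 else 0

indexExample : Double
indexExample = indexDiag 0 0 -- looking up row 0, column 0 yields 1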

For the sake of example, here is a much more ergonomic definition that asks the user to provide a nested vector of values:

TensorN : List Nat -> Type
TensorN [] = Double -- When the dimension is 0 the tensor is a scalar
TensorN (n  :: nested)  = Vect n (TensorN nested)

And the examples using it are a bit more legible.

namespace Examples2
  tensorScalar : TensorN []
  tensorScalar = 9

  tensorVec3 : TensorN [3]
  tensorVec3 = [1,2,3]


  tensorMat3x3 : TensorN [3,3]
  tensorMat3x3 = [[1,2,3],
                  [4,5,6],
                  [7,8,9]]

  tensor3x2x2 : TensorN [3,2,2]
  tensor3x2x2 = [[[1,2],[4,5]],
                 [[9,8],[7,6]],
                 [[4,5],[3,2]]]

We can now recognise tensors as we know them, indicating that our representation of arbitrary tensors is correct. Fundamentally, TensorN and Tensor represent the same data, but the function-based Tensor is the one we need for generalised tensors using containers.
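To back this claim up, here is a minimal sketch, not part of the original post, of one direction of the correspondence: tabulateFin is a hypothetical helper that samples a function at every index, and toTensorN uses it to turn a function-based Tensor into a nested-vector TensorN.

-- Sketch, not from the original post: build a Vect by sampling a function
-- at every index of Fin n.
tabulateFin : {n : Nat} -> (Fin n -> a) -> Vect n a
tabulateFin {n = Z} _ = []
tabulateFin {n = S k} f = f FZ :: tabulateFin {n = k} (f . FS)

-- Convert the function-based representation into the nested-vector one.
toTensorN : {shape : List Nat} -> Tensor shape -> TensorN shape
toTensorN {shape = []} x = x
toTensorN {shape = n :: ns} f = tabulateFin (\i => toTensorN {shape = ns} (f i))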

Containers & Type Descriptions

A very important part of our work is centered around containers & their morphisms. This mathematical object has a very simple definition in programming: it's a dependent type!

private typebind infixr 0 !>
record Container where
  constructor (!>)
  shapes : Type -- the set of possible shapes for a data type
  position : shapes -> Type -- the set of positions that hold values for any given shape

The most important part for us is why they are called containers, or sometimes “descriptions”: They represent inductive data types. Here are a couple of well-known containers:

-- 'List' container, usually described as `List a = Nil | Cons a (List a)`
ListDesc : Container
ListDesc = (len : Nat) !> Fin len

-- 'Maybe' container, usually defined as `Maybe a = Nothing | Just a`
MaybeDesc : Container
MaybeDesc = (isNil : Bool) !> if isNil then Void else Unit

-- 'Pair' container of two elements of the same type, usually defined as `Pair a = MkPair a a`
PairDesc : Container
PairDesc = (_ : Unit) !> Bool

-- 'Either' container of two elements of the same type, usually defined as `Either a = Success a | Failure a`
EitherDesc : Container
EitherDesc = (isSuccess : Bool) !> Unit

Containers are so called because they describe data types that contain data, like List or Maybe, but ListDesc and MaybeDesc are not actually types, merely descriptions of them. To turn a description into a type we need to interpret it. This interpretation is done via the extension of a container.

record Ext (c : Container) (a : Type) where
  constructor MkExt
  shape : c.shapes -- A particular shape from the set of possible shapes
  values : c.position shape -> a -- for each position in the shape, give a value

From the description and its extension, we can build the List type:

ListType : Type -> Type
ListType = Ext ListDesc

As expected, ListType has the form Type -> Type, meaning it will contain data of another type. Here is an example of a list of strings built from our description:

listOfStrings : ListType String
listOfStrings = MkExt 3 $
   \case 0 => "hello"
         1 => "world"
         2 => "!!!"

The syntax is a bit wonky because we first have to provide the shape of the list, here 3, and then, for each possible position, provide a value. Since there are three possible positions in a list of 3 elements, we write a function that returns a string for each index of the list.

shape : 3, aka lists of 3 elements
position: 0         1       2
          |         |       |
          V         V       V
      [ "hello", "world", "!!!" ]
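The same interpretation works for the other descriptions. As a small sketch that is not part of the original post, here are the two shapes of the MaybeDesc container interpreted through Ext; they correspond to Nothing and to Just applied to a string.

-- Sketch, not from the original post: Ext MaybeDesc behaves like Maybe.
-- The shape True (nil) has no positions, so `absurd` provides the values.
nothingString : Ext MaybeDesc String
nothingString = MkExt True absurd

-- The shape False (just) has exactly one position, which holds the value.
justString : Ext MaybeDesc String
justString = MkExt False (\_ => "hello")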

Combining Tensors and Containers

At first, containers seem like unnecessary complexity around data types. Why deal with the overhead of defining containers via shapes, positions, and then extensions when we already have algebraic data types available to us?

Our insight comes from Tensor, whose definition we repeat here:

              | shape
              V
Tensor : List Nat -> Type
Tensor [] = Double -- When the dimension is 0 the tensor is a scalar
Tensor (n   :: nested)  = Fin n -> Tensor nested
                            ^
                            | position

Just like a container, a tensor works by assigning values to the positions given by a list of shapes. Here the shapes are Nat, the length of the vector at a given dimension, and the values are given by Fin n -> Tensor nested, saying “for each position in the vector, there is a nested tensor”.

Recall that the definition of the container for the List type is (len : Nat) !> Fin len, meaning that we can substitute Nat and Fin n in the above definition to generalise the tensor for any container:

ContTensor : (c : Container) -> List c.shapes -> Type
ContTensor c [] = Double -- scalar case
ContTensor c (s :: nested) = c.position s -> ContTensor c nested -- nested case

And we now have that Tensor = ContTensor ListDesc, that is, a good ol’ tensor is a tensor defined by an iterated list container!

To verify this statement, we can replace our definition of the 3×3 matrix with one using ContTensor:

namespace Example3
  tensorMat3x3 : ContTensor ListDesc [3,3]
  tensorMat3x3 0 0 = 1
  tensorMat3x3 0 1 = 2
  tensorMat3x3 0 2 = 3
  tensorMat3x3 1 0 = 4
  tensorMat3x3 1 1 = 5
  tensorMat3x3 1 2 = 6
  tensorMat3x3 2 0 = 7
  tensorMat3x3 2 1 = 8
  tensorMat3x3 2 2 = 9

This means that we can replace ListDesc by any other container and obtain a rich tensorial structure. And now we can build special tensors using those containers:

PairTensor : Nat -> Type
PairTensor depth = ContTensor PairDesc (replicate depth ())

MaybeTensor : List Bool -> Type
MaybeTensor = ContTensor MaybeDesc

EitherTensor : List Bool -> Type
EitherTensor = ContTensor EitherDesc

The first one denotes pairs nested n times, so it is like an n-times nested tensor with 2 elements at each dimension.
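For instance, here is a hypothetical inhabitant, not from the original post, at depth 2: a function of two booleans, one binary choice per nesting level, essentially a 2×2 grid of numbers.

-- Sketch, not from the original post: a depth-2 PairTensor.
pairTensorValue : PairTensor 2
pairTensorValue False False = 1
pairTensorValue False True  = 2
pairTensorValue True  False = 3
pairTensorValue True  True  = 4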

The second one is a nested sequence of Maybe values; we can see what it does more clearly through examples:

maybeTensorValue1 : MaybeTensor [False, False, False]
maybeTensorValue1 () () () = 3

maybeTensorValue2 : MaybeTensor [False, True, False]
maybeTensorValue2 () isNil () = absurd isNil

The True in maybeTensorValue2 indicates that, one level into the nesting, we encounter a nil value, at which point there is no value left to return. This is like the type Maybe (Maybe (Maybe Double)) inhabited by the value Just Nothing.
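For comparison, here is that ordinary nested Maybe value (a small addition, not in the original post):

-- Not from the original post: the plain Maybe counterpart of maybeTensorValue2.
justNothing : Maybe (Maybe (Maybe Double))
justNothing = Just Nothing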

The third tensor is a sequence of binary choices, where the argument denotes the path taken to reach a value.

--            *
--           / \
--          *
--         / \
--        *
--       / \
--          3
leftLeftRight : EitherTensor [False, False, True]
leftLeftRight () () () = 3

--        *
--       / \
--          *
--         / \
--        *
--       / \
--          4
rightLeftRight : EitherTensor [True, False, True]
rightLeftRight () () () = 4

This is fantastic! We have a way to describe tensors of arbitrary depth for arbitrary data types, not just matrices of numbers. We can do even better: we can make a tensor out of multiple different containers. First we need to talk about lists.

Heterogeneous lists and Generalised Tensors

List has the type Type -> Type: it takes a type and returns the type of lists containing values of that type, repeated any number of times. List Int is a list of integers and List String is a list of strings; once we pick a type, we cannot mix different types into that list.

Sometimes this notion of list is too inflexible: we want to store different data inside a list, but the types won't permit it. You cannot insert a String into a List Int.

Dependent types solve this problem in a type-safe way with heterogeneous lists. These are lists parameterised not by a single type, but by a list of types, one for each value in the list.

Here is the definition; although it is a bit complicated, we are going to look at examples very soon.

data All : (a -> Type) -> List a -> Type where
  Nil : All p []
  (::) : {0 xs : List a} -> p x -> All p xs -> All p (x :: xs)

-- heterogeneous lists
HList : List Type -> Type
HList types = All id types

We first define All, which is the parent definition for heterogeneous lists, and then define HList as a special case of All. This gives us everything we need to define a list of integers and then add a string to it:

integers : HList [Int, Int, Int]
integers = [1,2,3]

integersStrings : HList ? -- [String, Int, Int, Int]
integersStrings = "hello" :: integers

The integers value is a list of numbers, and integersStrings adds a string to this list of numbers. The type is inferred using ?, but if we were to write down the type of integersStrings it would be HList [String, Int, Int, Int].

This really shows the power of dependent types: we get the flexibility of languages like JavaScript while keeping the type safety of languages like Rust or Haskell.

Like the List type, our container-based tensors are parameterised over a single Container. Can we write a notion of tensor that is like HList, where each dimension of the tensor is defined by a different container? Absolutely, but for this we need to combine everything we just learned. First, let's recall the definition of ContTensor.

ContTensor : (c : Container) -> List c.shapes -> Type
ContTensor c [] = Double -- scalar case
ContTensor c (s :: nested) = c.position s -> ContTensor c nested -- nested case

It works by taking a single Container and a list of shapes for that container. What we would like instead is to take a list of containers defining the shape of the tensor at each dimension and then, as a second argument, a heterogeneous list of shapes, one for every container given. Something like:

failing "Mismatch between: Container and Type."
  attempt : (containers : List Container) -> HList containers -> Type

This type does not quite work because HList expects a list of Type but we are giving it a list of Container. But we are close! Instead of using HList, we are going to use its more general form All and say “for each container given as an argument, expect a shape drawn from that container's shapes type”.

GeneralisedTensor : (containers : List Container) -> All shapes containers -> Type

This is the proper type for this definition: GeneralisedTensor is the equivalent of HList, but for container-based tensors. We adapt the definition of ContTensor to handle a list of containers as the first argument, but the changes are minor.

GeneralisedTensor [] [] = Double
GeneralisedTensor (c :: cs) (shapeHere :: shapes) = c.position shapeHere -> GeneralisedTensor cs shapes

This is almost the definition we are working with in TensorType. Let's understand it better before moving further. How would something we already know look in this setting? Here is a traditional 3×2×2 tensor represented this way:

OverkillTensor : Type
OverkillTensor = GeneralisedTensor [ListDesc, ListDesc, ListDesc] [3,2,2]

But we can do much more! For example, here is a tensor that maybe contains a list of length 5:

MaybeList : (isNil : Bool) -> Type
MaybeList isNil = GeneralisedTensor [MaybeDesc, ListDesc] [isNil, 5]
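As a quick sanity check, here is a hypothetical inhabitant, not from the original post, of the isNil = False case: one Unit choice for the Maybe layer followed by five list positions, all set to 7.

-- Sketch, not from the original post: the “just” case of MaybeList.
maybeListValue : MaybeList False
maybeListValue () _ = 7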

And here is the type of a 3×3 matrix containing lists of Maybe values of vectors of length 5:

ComplexTensor : (len : Nat) -> (isNil : Bool) -> Type
ComplexTensor len isNil =
  GeneralisedTensor
    [ListDesc, ListDesc, ListDesc, MaybeDesc, ListDesc]
    [3, 3, len, isNil, 5]
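To see how the five containers show up as five arguments, here is a hypothetical constant inhabitant, not from the original post: the arguments are the row, the column, the index into the list (of length 4 here), the Maybe choice, and the index into the final vector of length 5.

-- Sketch, not from the original post: a ComplexTensor that is 0 everywhere.
constantComplex : ComplexTensor 4 False
constantComplex _ _ _ () _ = 0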

The final step towards the definition in TensorType is to realise that we do not even need to specify things like len or isNil upfront.

A tensor, simply put, is a fold over containers using something special called the “composition product” of containers.

This yields a strikingly simple formula:

Tensor : List Container -> Container
Tensor = foldr (>@) Scalar

but one we leave for another time.

Conclusion

We’ve seen how to go from traditional tensor definitions to generalised tensors, using examples powered by dependent types. This is a formidable example of how dependent types enable the kind of program reasoning that merges theoretical knowledge with real-world programs executable on CPUs and GPUs.

We used Idris because its simple core and powerful FFI give us a clear path toward performant code with complex types. If you prefer another language, you can replicate these definitions in any dependently-typed programming language, such as Rocq, Agda, or Lean! We are working on the latter to study, and hopefully improve, the performance of AI agents for proofs using some of the technology presented here. This is just the tip of the iceberg, and there is much more we’re excited to share in the year 2026.

tags: containers - poly - category theory - idris - machine learning