{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\"Chisel" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "# Module 3.4: Functional Programming\n", "**Prev: [Higher-Order Functions](3.3_higher-order_functions.ipynb)**
\n", "**Next: [Object Oriented Programming](3.5_object_oriented_programming.ipynb)**\n", "\n", "## Motivation\n", "You saw functions in many previous modules, but now it's time to make our own and use them effectively.\n", "\n", "## Setup" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "val path = System.getProperty(\"user.dir\") + \"/source/load-ivy.sc\"\n", "interp.load.module(ammonite.ops.Path(java.nio.file.FileSystems.getDefault().getPath(path)))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "This module uses the Chisel `FixedPoint` type, which currently resides in the experimental package." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import chisel3._\n", "import chisel3.util._\n", "import chisel3.tester._\n", "import chisel3.tester.RawTester.test\n", "import chisel3.experimental._\n", "import chisel3.internal.firrtl.KnownBinaryPoint" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "# Functional Programming in Scala\n", "Scala functions were introduced in Module 1, and you saw them being used a lot in the previous module. Here's a refresher on functions. Functions take any number of inputs and produce one output. Inputs are often called arguments to a function. To produce no output, return the `Unit` type. \n", "\n", "**Example: Custom Functions**
\n", "Below are some examples of functions in Scala." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// No inputs or outputs (two versions).\n", "def hello1(): Unit = print(\"Hello!\")\n", "def hello2 = print(\"Hello again!\")\n", "\n", "// Math operation: one input and one output.\n", "def times2(x: Int): Int = 2 * x\n", "\n", "// Inputs can have default values, and explicitly specifying the return type is optional.\n", "// Note that we recommend specifying the return types to avoid surprises/bugs.\n", "def timesN(x: Int, n: Int = 2) = n * x\n", "\n", "// Call the functions listed above.\n", "hello1()\n", "hello2\n", "times2(4)\n", "timesN(4) // no need to specify n to use the default value\n", "timesN(4, 3) // argument order is the same as the order where the function was defined\n", "timesN(n=7, x=2) // arguments may be reordered and assigned to explicitly" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Functions as Objects\n", "Functions in Scala are first-class objects. That means we can assign a function to a `val` and pass it to classes, objects, or other functions as an argument.\n", "\n", "**Example: Function Objects**
\n", "Below are the same functions implemented as functions and as objects." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// These are normal functions.\n", "def plus1funct(x: Int): Int = x + 1\n", "def times2funct(x: Int): Int = x * 2\n", "\n", "// These are functions as vals.\n", "// The first one explicitly specifies the return type.\n", "val plus1val: Int => Int = x => x + 1\n", "val times2val = (x: Int) => x * 2\n", "\n", "// Calling both looks the same.\n", "plus1funct(4)\n", "plus1val(4)\n", "plus1funct(x=4)\n", "//plus1val(x=4) // this doesn't work" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Why would you want to create a `val` instead of a `def`? With a `val`, you can now pass the function around to other functions, as shown below. You can even create your own functions that accept other functions as arguments. Formally, functions that take or produce functions are called *higher-order functions*. You saw them used in the last module, but now you'll make your own!\n", "\n", "**Example: Higher-Order Functions**
\n", "Here we show `map` again, and we also create a new function, `opN`, that accepts a function, `op`, as an argument." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// create our function\n", "val plus1 = (x: Int) => x + 1\n", "val times2 = (x: Int) => x * 2\n", "\n", "// pass it to map, a list function\n", "val myList = List(1, 2, 5, 9)\n", "val myListPlus = myList.map(plus1)\n", "val myListTimes = myList.map(times2)\n", "\n", "// create a custom function, which performs an operation on X N times using recursion\n", "def opN(x: Int, n: Int, op: Int => Int): Int = {\n", " if (n <= 0) { x }\n", " else { opN(op(x), n-1, op) }\n", "}\n", "\n", "opN(7, 3, plus1)\n", "opN(7, 3, times2)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Example: Functions vs. Objects**
\n", "A possibly confusing situation arises when using functions without arguments. Functions are evaluated every time they are called, while `val`s are evaluated at instantiation. " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "import scala.util.Random\n", "\n", "// both x and y call the nextInt function, but x is evaluated immediately and y is a function\n", "val x = Random.nextInt\n", "def y = Random.nextInt\n", "\n", "// x was previously evaluated, so it is a constant\n", "println(s\"x = $x\")\n", "println(s\"x = $x\")\n", "\n", "// y is a function and gets reevaluated at each call, thus these produce different results\n", "println(s\"y = $y\")\n", "println(s\"y = $y\")" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## Anonymous Functions\n", "As the name implies, anonymous functions are nameless. There's no need to create a `val` for a function if we'll only use it once. \n", "\n", "**Example: Anonymous Functions**
\n", "The following example demonstrates this. They are often scoped (put in curly braces instead of parentheses). " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "val myList = List(5, 6, 7, 8)\n", "\n", "// add one to every item in the list using an anonymous function\n", "// arguments get passed to the underscore variable\n", "// these all do the same thing\n", "myList.map( (x:Int) => x + 1 )\n", "myList.map(_ + 1)\n", "\n", "// a common situation is to use case statements within an anonymous function\n", "val myAnyList = List(1, 2, \"3\", 4L, myList)\n", "myAnyList.map {\n", " case (_:Int|_:Long) => \"Number\"\n", " case _:String => \"String\"\n", " case _ => \"error\"\n", "}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Exercise: Sequence Manipulation**
\n", "A common set of higher-order functions you'll use are `scanLeft`/`scanRight`, `reduceLeft`/`reduceRight`, and `foldLeft`/`foldRight`. It's important to understand how each one works and when to use them. The default directions for `scan`, `reduce`, and `fold` are left, though this is not guaranteed for all cases. " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "val exList = List(1, 5, 7, 100)\n", "\n", "// write a custom function to add two numbers, then use reduce to find the sum of all values in exList\n", "def add(a: Int, b: Int): Int = ???\n", "val sum = ???\n", "\n", "// find the sum of exList using an anonymous function (hint: you've seen this before!)\n", "val anon_sum = ???\n", "\n", "// find the moving average of exList from right to left using scan; make the result (ma2) a list of doubles\n", "def avg(a: Int, b: Double): Double = ???\n", "val ma2 = ???" ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "assert(add(88, 88) == 176)\n", "assert(sum == 113)\n", "\n", "assert(anon_sum == 113)\n", "\n", "assert(avg(100, 100.0) == 100.0)\n", "assert(ma2 == List(8.875, 16.75, 28.5, 50.0, 0.0))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
\n", "\n", "\n", "
\n", "
\n",
    "def add(a: Int, b: Int): Int = a + b\n",
    "val sum = exList.reduce(add)\n",
    "\n",
    "val anon\\_sum = exList.reduce(\\_ + \\_)\n",
    "\n",
    "def avg(a: Int, b: Double): Double = (a + b)/2.0\n",
    "val ma2 = exList.scanRight(0.0)(avg)\n",
    "
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "# Functional Programming in Chisel\n", "Let's look at some examples of how to use functional programming when creating hardware generators in Chisel." ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "**Example: FIR Filter**
\n", "First, we'll revisit the FIR filter from previous examples. Instead of passing in the coefficients as parameters to the class or making them programmable, we'll pass a function to the FIR that defines how the window coefficients are calculated. This function will take the window length and bitwidth to produce a scaled list of coefficients. Here are two example windows. To avoid fractions, we'll scale the coefficients to be between the max and min integer values. For more on these windows, check out the [this Wikipedia page](https://en.wikipedia.org/wiki/Window_function)." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// get some math functions\n", "import scala.math.{abs, round, cos, Pi, pow}\n", "\n", "// simple triangular window\n", "val TriangularWindow: (Int, Int) => Seq[Int] = (length, bitwidth) => {\n", " val raw_coeffs = (0 until length).map( (x:Int) => 1-abs((x.toDouble-(length-1)/2.0)/((length-1)/2.0)) )\n", " val scaled_coeffs = raw_coeffs.map( (x: Double) => round(x * pow(2, bitwidth)).toInt)\n", " scaled_coeffs\n", "}\n", "\n", "// Hamming window\n", "val HammingWindow: (Int, Int) => Seq[Int] = (length, bitwidth) => {\n", " val raw_coeffs = (0 until length).map( (x: Int) => 0.54 - 0.46*cos(2*Pi*x/(length-1)))\n", " val scaled_coeffs = raw_coeffs.map( (x: Double) => round(x * pow(2, bitwidth)).toInt)\n", " scaled_coeffs\n", "}\n", "\n", "// check it out! first argument is the window length, and second argument is the bitwidth\n", "TriangularWindow(10, 16)\n", "HammingWindow(10, 16)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now we'll create a FIR filter that accepts a window function as the argument. This allows us to define new windows later on and retain the same FIR generator. It also allows us to independently size the FIR, knowing the window will be recalculated for different lengths or bitwidths. Since we are choosing the window at compile time, these coefficients are fixed." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// our FIR has parameterized window length, IO bitwidth, and windowing function\n", "class MyFir(length: Int, bitwidth: Int, window: (Int, Int) => Seq[Int]) extends Module {\n", " val io = IO(new Bundle {\n", " val in = Input(UInt(bitwidth.W))\n", " val out = Output(UInt((bitwidth*2+length-1).W)) // expect bit growth, conservative but lazy\n", " })\n", "\n", " // calculate the coefficients using the provided window function, mapping to UInts\n", " val coeffs = window(length, bitwidth).map(_.U)\n", " \n", " // create an array holding the output of the delays\n", " // note: we avoid using a Vec here since we don't need dynamic indexing\n", " val delays = Seq.fill(length)(Wire(UInt(bitwidth.W))).scan(io.in)( (prev: UInt, next: UInt) => {\n", " next := RegNext(prev)\n", " next\n", " })\n", " \n", " // multiply, putting result in \"mults\"\n", " val mults = delays.zip(coeffs).map{ case(delay: UInt, coeff: UInt) => delay * coeff }\n", " \n", " // add up multiplier outputs with bit growth\n", " val result = mults.reduce(_+&_)\n", "\n", " // connect output\n", " io.out := result\n", "}\n", "\n", "visualize(() => new MyFir(7, 12, TriangularWindow))" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Those last three lines could be easily combined into one. Also notice how we've handled bitwidth growth conservatively to avoid loss.\n", "\n", "**Example: FIR Filter Tester**
\n", "Let's test our FIR! Previously, we provided a custom golden model. This time we'll use Breeze, a Scala library of useful linear algebra and signal processing functions, as a golden model for our FIR filter. The code below compares the Chisel output with the golden model output, and any errors cause the tester to fail.\n", "\n", "Try uncommenting the print statment at the end just after the expect call. Also try changing the window from triangular to Hamming." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// math imports\n", "import scala.math.{pow, sin, Pi}\n", "import breeze.signal.{filter, OptOverhang}\n", "import breeze.signal.support.{CanFilter, FIRKernel1D}\n", "import breeze.linalg.DenseVector\n", "\n", "// test parameters\n", "val length = 7\n", "val bitwidth = 12 // must be less than 15, otherwise Int can't represent the data, need BigInt\n", "val window = TriangularWindow\n", "\n", "// test our FIR\n", "test(new MyFir(length, bitwidth, window)) { c =>\n", " \n", " // test data\n", " val n = 100 // input length\n", " val sine_freq = 10\n", " val samp_freq = 100\n", "\n", " // sample data, scale to between 0 and 2^bitwidth\n", " val max_value = pow(2, bitwidth)-1\n", " val sine = (0 until n).map(i => (max_value/2 + max_value/2*sin(2*Pi*sine_freq/samp_freq*i)).toInt)\n", " //println(s\"input = ${sine.toArray.deep.mkString(\", \")}\")\n", "\n", " // coefficients\n", " val coeffs = window(length, bitwidth)\n", " //println(s\"coeffs = ${coeffs.toArray.deep.mkString(\", \")}\")\n", "\n", " // use breeze filter as golden model; need to reverse coefficients\n", " val expected = filter(\n", " DenseVector(sine.toArray),\n", " FIRKernel1D(DenseVector(coeffs.reverse.toArray), 1.0, \"\"),\n", " OptOverhang.None\n", " )\n", " expected.toArray // seems to be necessary\n", " //println(s\"exp_out = ${expected.toArray.deep.mkString(\", \")}\") // this seems to be necessary\n", "\n", " // push data through our FIR and check the result\n", " c.reset.poke(true.B)\n", " c.clock.step(5)\n", " c.reset.poke(false.B)\n", " for (i <- 0 until n) {\n", " c.io.in.poke(sine(i).U)\n", " if (i >= length-1) { // wait for all registers to be initialized since we didn't zero-pad the data\n", " val expectValue = expected(i-length+1)\n", " //println(s\"expected value is $expectValue\")\n", " c.io.out.expect(expected(i-length+1).U)\n", " //println(s\"cycle $i, got ${c.io.out.peek()}, expect ${expected(i-length+1)}\")\n", " }\n", " c.clock.step(1)\n", " }\n", "}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "# Chisel Exercises\n", "Complete the following exercises to practice writing functions, using them as arguments to hardware generators, and avoiding mutable data.\n", "\n", "**Exercise: Neural Network Neuron**
\n", "Our first example will have you build a neuron, the building block of fully-connected layers in artificial neural networks. Neurons take inputs and a set of weights, one per input, and produce one output. The weights and inputs are multiplied and added, and the result is fed through an activation function. In this exercise, you will implement different activation functions and pass them as an argument to your neuron generator.\n", "\n", "![Neuron](https://upload.wikimedia.org/wikipedia/commons/thumb/6/60/ArtificialNeuronModel_english.png/600px-ArtificialNeuronModel_english.png)\n", "\n", "First, complete the following code to create a neuron generator. The parameter `inputs` gives the number of inputs. The parameter `act` is a function that implements the logic of the activation function. We'll make the inputs and outputs 16-bit fixed point values with 8 fractional bits." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "class Neuron(inputs: Int, act: FixedPoint => FixedPoint) extends Module {\n", " val io = IO(new Bundle {\n", " val in = Input(Vec(inputs, FixedPoint(16.W, 8.BP)))\n", " val weights = Input(Vec(inputs, FixedPoint(16.W, 8.BP)))\n", " val out = Output(FixedPoint(16.W, 8.BP))\n", " })\n", " \n", " ???\n", "}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
\n", "\n", "\n", "
\n", "
\n",
    "  val mac = io.in.zip(io.weights).map{ case(a:FixedPoint, b:FixedPoint) => a*b}.reduce(_+_)\n",
    "  io.out := act(mac)\n",
    "
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Now let's create some activation functions! We'll use a threshold of zero. Typical activation functions are the sigmoid function and the rectified linear unit (ReLU).\n", "\n", "The sigmoid we'll use is called the [logistic function](https://en.wikipedia.org/wiki/Logistic_function), given by \n", "\n", "$logistic(x) = \\cfrac{1}{1+e^{-\\beta x}}$\n", "\n", "where $\\beta$ is the slope factor. However, computing the exponential function in hardware is quite challenging and expensive. We'll approximate this as the step function.\n", "\n", "$step(x) = \\begin{cases}\n", " 0 & \\text{if } x \\le 0 \\\\\n", " 1 & \\text{if } x \\gt 0\n", " \\end{cases}$\n", "\n", "The second function, the ReLU, is given by a similar formula.\n", "\n", "$relu(x) = \\begin{cases}\n", " 0 & \\text{if } x \\le 0 \\\\\n", " x & \\text{if } x \\gt 0\n", " \\end{cases}$\n", "\n", "Implement these two functions below. You can specify a fixed-point literal like `-3.14.F(8.BP)`. " ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "val Step: FixedPoint => FixedPoint = ???\n", "val ReLU: FixedPoint => FixedPoint = ???" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
\n", "\n", "\n", "
\n", "
\n",
    "val Step: FixedPoint => FixedPoint = x => Mux(x <= 0.F(8.BP), 0.F(8.BP), 1.F(8.BP))\n",
    "val ReLU: FixedPoint => FixedPoint = x => Mux(x <= 0.F(8.BP), 0.F(8.BP), x)\n",
    "
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Finally, let's create a tester that checks the correctness of our Neuron. With the step activation function, neurons may be used as logic gate approximators. Proper selection of weights and bias can perform binary functions. We'll test our neuron using AND logic. Complete the following tester to check our neuron with the step function.\n", "\n", "Note that since the circuit is purely combinational, the `reset(5)` and `step(1)` calls are not necessary." ] }, { "cell_type": "code", "execution_count": null, "metadata": {}, "outputs": [], "source": [ "// test our Neuron \n", "test(new Neuron(2, Step)) { c =>\n", " val inputs = Seq(Seq(-1, -1), Seq(-1, 1), Seq(1, -1), Seq(1, 1))\n", "\n", " // make this a sequence of two values\n", " val weights = ???\n", "\n", " // push data through our Neuron and check the result (AND gate)\n", " c.reset.poke(true.B)\n", " c.clock.step(5)\n", " c.reset.poke(false.B)\n", " for (i <- inputs) {\n", " c.io.in(0).poke(i(0).F(8.BP))\n", " c.io.in(1).poke(i(1).F(8.BP))\n", " c.io.weights(0).poke(weights(0).F(16.W, 8.BP))\n", " c.io.weights(1).poke(weights(1).F(16.W, 8.BP))\n", " c.io.out.expect((if (i(0) + i(1) > 0) 1 else 0).F(16.W, 8.BP))\n", " c.clock.step(1)\n", " }\n", " \n", "}" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "
\n", "\n", "\n", "
\n", "
\n",
    "val weights  = Seq(1.0, 1.0)\n",
    "
" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---\n", "# You're done!\n", "\n", "[Return to the top.](#top)" ] } ], "metadata": { "kernelspec": { "display_name": "Scala", "language": "scala", "name": "scala" }, "language_info": { "codemirror_mode": "text/x-scala", "file_extension": ".scala", "mimetype": "text/x-scala", "name": "scala", "nbconvert_exporter": "script", "version": "2.12.10" } }, "nbformat": 4, "nbformat_minor": 2 }