From the Burrow

An out-of-control reddit post

2015-02-24 09:32:48 +0000

I tried to write a short reply to someone on extending a language with macros…I failed…it got long

This was the thread:

[note: this assumes you’re a lisp newbie, sorry if this is patronising] Is this what ‘THEY’ mean when they say Lisp allows you to invent your own language? No, not really. But it is nice to be able to use symbols normally deemed off limits, right? Syntax is part of language though, so what you are doing is in the sphere of language. Let’s have a look at a macro:

(defmacro fn^ (name args &body body)
  `(defun ,name ,args
     ,@body))

Ok so first let’s see what it does and then how it does it.

A macro is a function that runs at a different time, weird right?! The above kind is run before your program is compiled. The arguments will be source code and the return value will be new code to go in place of the old code. So this:

(fn^ some-func (x) (* x 10))

becomes

(defun some-func (x) (* x 10))

See what happened there? The macro was given the following arguments:

name -> some-func
args -> (x)
body -> ((* x 10))

If not, you may want to read up on ‘backquote syntax’; it is (approximately) a very tidy way of splicing lists and data together.
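As a quick sketch of how backquote fits together (the bindings below just mirror the macro arguments from before): , inserts a value and ,@ splices a list in.

```lisp
;; Backquote in a nutshell: , inserts a value, ,@ splices a list in.
(let ((name 'some-func)
      (args '(x))
      (body '((* x 10))))
  `(defun ,name ,args ,@body))
;; => (DEFUN SOME-FUNC (X) (* X 10))
```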

Ok so now that we have functions and this kind of macro (yes, there are other kinds), we can imagine redefining every common construct in common lisp.

Let’s say: let becomes val<- and make-hash-table becomes hsh^:

(fn^ fnc (%) (val<- ((x (hsh^)) (y :dflt)) (<> % x)))

Well it’s clearly still lisp but it’s very unfamiliar (and ugly! :D). While this may affect language design it doesn’t feel like a new language. Let’s do something else with these macros:

(defmacro val<- (&rest form)
  (let* ((body (last form))
         (bindings (butlast form))
         (b (loop for i below (1- (length bindings)) by 2 collect
                 (list (elt bindings i) (elt bindings (1+ i))))))
    `(let ,b
       ,@body)))

So now

(val<- ((x (hsh^)) (y :dflt))
       (<> % x))

could be written as

(val<- x (hsh^) y :dflt
       (<> % x))
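If you want to check the expansion yourself, macroexpand-1 will show it (assuming the val<- definition above is loaded; hsh^ is just the rename from before):

```lisp
(macroexpand-1 '(val<- x (hsh^) y :dflt
                 (<> % x)))
;; => (LET ((X (HSH^)) (Y :DFLT))
;;      (<> % X))
```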

OK! so now this is different. This feels more like changing the language, as we are messing with how the code behaves. The issue is… now that we are messing with language, we not only have the normal problems a programmer deals with but also those of a language designer. These are issues of feel and experience as well as functionality (it’s very like api design but with much deeper implications); the answers are often subjective and based on what the language is designed to solve. A big one for me is ‘Does the programmer have to think more to achieve something using this?’.

In my own project I use macros to make defining glsl shaders feel like defining regular lisp functions.

(defshader some-name ((x :int))
  (+ x 10))

And translates the lisp code into glsl for the programmer.

We have only scratched the surface of what we can do in this little blogpost. There are also other kinds of macros, reader macros for example. Reader macros are functions that run before the macros we have seen already. They don’t receive the code as lists of symbols; they receive the actual characters from the code you wrote. They allow for cool and often crazy stuff: all of the literal syntax you see around lisp (‘ ` , #() etc) is made possible using reader macros, and you can extend that in any way you can conceive.
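For a taste of what a reader macro looks like, here is a small sketch (not from cepl, just an illustration) that makes [a b c] read as (list a b c):

```lisp
;; Sketch: a reader macro giving [ ... ] list-literal syntax.
(defun |[-reader| (stream char)
  (declare (ignore char))
  ;; read everything up to the matching ] and wrap it in a call to list
  (cons 'list (read-delimited-list #\] stream t)))

(set-macro-character #\[ #'|[-reader|)
(set-macro-character #\] (get-macro-character #\)))

;; now [1 2 (+ 1 2)] evaluates to (1 2 3)
```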

I hope this was of some use. Sorry it turned into a whole post rather than a comment!

p.s. Also check out:

  • macros give CL pattern matching
  • Info on reader macros
  • symbol-macrolet <- this type of macro swaps out a symbol for another whole form

Cepl Video Update

2015-02-04 23:18:58 +0000

Just stuck a new video on youtube showing a bit of multipass rendering, normal mapping and refraction.

What a fucking surprise

2015-01-16 17:10:56 +0000

RE: Your Content Feed Is Broken

2015-01-16 09:33:18 +0000

“Our Point of Acknowledgment is the moment when we (the user) understand what we are viewing and can then recall that this information has been consumed. It is a continuous moving variable that helps us track where we are in time.”

Time tracking doesn’t require acknowledging every point though, just enough discrete points that we get a sense of moving forward, coupled with the critical assurance that the points will not rearrange themselves.

Movie and book analogies are broken: both experiences are designed knowing the content volume is fixed (designed is a very generous term if you are dealing with dvd/Blu-ray UIs… or most devices for hard media for that matter).

Interesting concept, but the naive implementation just optimizes for a different viewing habit. The author acknowledges this by saying: “Each feed will need to choose an appropriate location to place new users.” This is a non-trivial problem; deciding a user’s content priority is hard. If you pick one and dictate it, then we are just setting things up so that another well-intentioned blog post like this will come along saying ‘wouldn’t it be great to have our media presented like X’.

“whining as loud as the rest of you” - Dismissing issues as whining is a massive fallacy. User experience is not just design; it isn’t just entry points. It covers the gamut of ways the user and the medium interact; experience is not a one-way phenomenon.

How you treat people’s data is part of the experience, because if you fuck up, they experience it. Whether it is losing credit card details or leaking browsing habits, the experience lives beyond a page view if you choose to retain data.

The fundamental question we are dealing with, once again, is: ‘how do I track my position in a world changing too rapidly to comprehend alone?’.

This question remains open and is hard. I believe these problems are solvable without going down the road of companies/websites/etc saying ‘well, with just one more bit of data we can make your life better… wait, no, just two more bits’.

Eval into file

2015-01-08 00:28:51 +0000

Just wrote a handy function for working with slime.

  • It takes the top level form at that point e.g. (make-instance 'test)
  • Wraps it in a defparameter with a new var name (defparameter <iv-0> (make-instance 'test))
  • Evals it using slime and injects the var name (in this case <iv-0>) into the file just after the toplevel form

This is useful when sketching out ideas in a file and you want to compile something but also capture the result in a global for messing with later

Running it 3 times would give you something like:

(make-instance 'test)
<iv-2> <iv-1> <iv-0>

which makes it very easy to wrap a list around them and throw them wherever you see fit.

(defvar eval-into-file-count 0)
(defun slime-eval-into-file ()
  "Evaluate the current toplevel form.
store the result in a new global and insert the 
var into the code"
  (let* ((form (slime-defun-at-point))
         (var-name (concat "<iv-" (number-to-string eval-into-file-count) ">"))
         (form-with-var (concat "(defparameter " var-name " " form ")")))
    (setq eval-into-file-count (+ eval-into-file-count 1))
    (slime-eval-async `(swank:eval-and-grab-output ,form-with-var)
      (lambda (result)
        (cl-destructuring-bind (output value) result
          (insert value " "))))))


Hither and thither

2014-12-04 23:38:30 +0000

Ah, one thing I need to nail before geometry shaders is better support for FBOs, as I remembered I need a way to draw to multiple textures from one shader.

To out or not to out

2014-12-03 23:49:36 +0000

(defpipeline prog-1 ((vert vert-data))
  (:vertex (setf gl-position (pos vert))
           (out (the-color :smooth) (col vert)))
  (:fragment (let ((lerp-value (/ (y gl-frag-coord) 500.0)))
               (out outputColor (mix the-color
                                     (v! 0.2 0.2 0.2 1.0)
                                     lerp-value)))))

Currently setting the out vars of a shader stage is done with the out special form. The above code compiles to

// vertex shader
#version 330

layout(location = 0) in vec4 fk_vert_position;
layout(location = 1) in vec4 fk_vert_colour;
smooth out vec4 the_color;

void main() {
    gl_Position = fk_vert_position;
    the_color = fk_vert_colour;
}

// fragment shader
#version 330

smooth in vec4 the_color;
out vec4 outputcolor;

void main() {
    float lerp_value_4v = (gl_FragCoord.y / 500.0f);
    outputcolor = mix(the_color,vec4(0.2f,0.2f,0.2f,1.0f),lerp_value_4v);
}

Notice how the out form essentially compiles to a global var and a setf.

In the quest for lispyness, should this stay as it is? There isn’t anything wrong with setf per se, however it is worth investigating other forms to see if anything surprises us…and by us I mean me :)

Here is a version with a version of values that allows naming

(defpipeline prog-1 ((vert vert-data))
  (:vertex (values (gl-position (pos vert))
                   ((the-color :smooth) (col vert))))
  (:fragment
   (values (outputColor
            (let ((lerp-value (/ (y gl-frag-coord) 500.0)))
              (mix the-color
                   (v! 0.2 0.2 0.2 1.0)
                   lerp-value))))))

Eh… the vertex shader is ok, but this introduces an extra level of indentation that looks ugly.

Ok so how about setf style? Setf allows a name-value style to set multiple variables

(setf a 1
      b 2)

So for the shaders that is…

(defpipeline prog-1 ((vert vert-data))
  (:vertex (values gl-position (pos vert)
                   (the-color :smooth) (col vert)))
  (:fragment
   (values outputColor
           (let ((lerp-value (/ (y gl-frag-coord) 500.0)))
             (mix the-color
                  (v! 0.2 0.2 0.2 1.0)
                  lerp-value)))))

.. nope, doesn’t feel better aesthetically and now makes identifying the val and name forms more difficult.

Declare style?

(defpipeline prog-1 ((vert vert-data))
  (:vertex (declare (out gl-position (the-color :smooth)))
           (values (pos vert) (col vert)))
  (:fragment (declare (out outputColor))
             (values (let ((lerp-value (/ (y gl-frag-coord) 500.0)))
                       (mix the-color
                            (v! 0.2 0.2 0.2 1.0)
                            lerp-value)))))

..Interesting, yesterday I was looking at declare for types, so if that was the case then using this would feel natural.

Let’s see what happens if the types are in declare too:

(defpipeline prog-1 (vert)
  (:vertex (declare (vert-data vert)
                    (out gl-position (the-color :smooth)))
           (values (pos vert) (col vert)))
  (:fragment (declare (out outputColor))
             (values (let ((lerp-value (/ (y gl-frag-coord) 500.0)))
                       (mix the-color (v! 0.2 0.2 0.2 1.0) lerp-value)))))

Haha, not much changed, as almost everything here was already being done with inference.

There is still an issue though that is best summed up in Erik Naggum’s quote on c++. For my purposes the fact that this is about c++ is irrelevant.

C++ is philosophically and cognitively unsound as it forces a violation
of all known epistemological processes on the programmer.  as a language,
it requires you to specify in great detail what you do not know in order
to obtain the experience necessary to learn it. -- Erik Naggum

Declaring types in advance forces the programmer to make an assertion about the nature of code not yet written. This violates the concept of live coding: we are coding not because we have the solution, but because we are looking for one.

Now obviously, when coding in the restricted environment of the GPU, types are currently a necessity, so the best we can do is minimize the cognitive damage. Other than having inference everywhere we possibly can, the other option is to delay specifying the type until it is actually needed, in most cases at the arg definition.

Hmm so no progress on this but I’m glad I’ve been able to empty my head a little.


Sane defaults for Uniforms

2014-12-03 08:24:49 +0000

OK so at the end of the last post I started musing about having sensible defaults for uniform values. The logic behind this is as follows:

So we have a shader ‘vert’

(defvshader vert ((position :vec4) &uniform (i :int) (loop :float))
  (let ((pos (v! (* (s~ position :xyz) 0.3) 1.0)))
    (setf gl-position (+ pos (calc-offset (+ (float i)) loop)))))

And now I want to add a new uniform: (thing :int)

So do I add the new argument to vert, compile, and watch it freak out because the uniform has no value? Or do I add it to the calls to vert first and watch them freak out because the shader doesn’t take this argument yet?

Both modes suck. It obviously makes sense to modify the shader first, but on first compile to have some default value that won’t cause the shader to panic.

This is easy for numbers, vectors and matrices as all have a concept of an identity, but what about structs?

Well, all gl-structs ultimately have to be composed of types glsl understands, therefore we could have defglstruct generate an identity value we can use.
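A sketch of how that identity generation could work; identity-value, *gl-type-identities* and struct-slot-types are all hypothetical names here, not cepl’s actual api:

```lisp
;; Hypothetical sketch: base glsl types map to identity values, and a
;; struct's identity is built out of its slots' identities.
(defparameter *gl-type-identities*
  '((:int . 0) (:float . 0.0)
    (:vec3 . #(0.0 0.0 0.0)) (:vec4 . #(0.0 0.0 0.0 0.0))))

(defun identity-value (type)
  "Return a default value for TYPE, recursing into struct slot types."
  (or (cdr (assoc type *gl-type-identities*))
      ;; struct-slot-types is assumed to return the slot types of a
      ;; defglstruct-defined type
      (mapcar #'identity-value (struct-slot-types type))))
```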

What about textures? Do we create a default unit texture for each sampler type? Could do. And we don’t have to worry about ‘index out of bounds’ as, by default, indexes just wrap for textures. (This is not true for texelFetch, where the result is undefined, but it shouldn’t crash, so it will buy enough time to provide real values.)

according to here:

"The only limit of glGenTextures is given by the bit width of the texture name (GLint), which is 32 bit;"

Which means one (1x1..n) texture for each of the 18 sampler types shouldn’t be an issue.

Now what about uniform buffers? I guess I will want something similar to gl-structs, but as I haven’t implemented UBOs yet I won’t worry about it until I am.

Right, back soon!


Soon there will be a new Stage

2014-12-02 23:50:22 +0000

After having merged in all the nice changes to foreign arrays I have decided to start looking at what is probably the biggest hole in the glsl compiler right now, Geometry Shaders.

I’m beavering away on some refactoring and code cleanup to make this tractable, which is going rather well. There should be no major changes to the compiler needed as I added all the geometry shader specific functions ages ago. I think it will mainly be glsl ‘out’ variable type transformations. I will also need to be able to specify the geometry type (tris, points, lines) in the context as that logic will be needed for proving correctness in the geometry stage.

I am not too fussed about adding Tessellation shaders urgently, mainly because I haven’t learned how to use them yet.

I am also musing over the idea of supporting the use of ‘declare’ to specify shader argument types and the context. So that the following

(defvshader test ((verts pnc) &uniform (i :int) (origin :vec3)
                  (cam-to-world :mat4) &context :version 330)
  ...shader code here)

Would instead look like:

(defvshader test (verts &uniform i origin cam-to-world)
  (declare (pnc verts) (:int i) (:vec3 origin)
           (:mat4 cam-to-world) (context :version 330))
  ...shader code here)

Which is more lispy…but as types are mandatory I’m not sure how I feel about this. Does this also mean I specify ‘out’ variables here?

** NOTE: At the start of this post I knew what I was going to blog about, but now I am off on a mental wander. So sorry for the roundabout nature of the rest of the post :) **

It does make the main signature clearer, especially where array types are concerned.

(i (:int 3))

The above is the current signature for an array of 3 ints; get a few of these in the argument list and it gets messy.

But this also means I will have to do the same for labels…

(labels ((test ((i :int) (y (:int 3)))
           (+ i (aref y 0))))
  (test 1 some-int-array))

would become

(labels ((test (i y)
           (declare (:int i) ((:int 3) y))
           (+ i (aref y 0))))
  (test 1 some-int-array))

Hmm, not too bad, the first example is a jumble… how about ‘let’ forms?

(let ((x 1)
      ((y :int) 2)
      ((z (:int 3)) some-array))
  (some-func x y z))

Notice how the types are optional as they can be inferred, so in declare style

(let ((x 1)
      (y 2)
      (z some-array))
  (declare (:int y) ((:int 3) z))
  (some-func x y z))

Ok now in this case it is a BIG improvement to the actual readability of the declarations themselves.

The big worry for me though is giving you two places to look for one piece of information when reading the code. What you want to know is “What is x?”, and you have to look both at its declaration (so you can know its scope) and at the declare (to know its type).

The other possibility is to be super un-common-lispy and use some special reader syntax that is ONLY valid in bindings.

(labels ((test ([i :int] [y (:int 3)])
           (+ i (aref y 0))))
  (test 1 some-int-array))

(let ((x 1)
      ([y :int] 2)
      ([z (:int 3)] some-array))
  (some-func x y z))

Ugh…even writing it down feels wrong. Nah, scrap this idea. I think the declare style is growing on me but I will need to give this some time. I’d appreciate any ideas on this one.


p.s. But wait, one other advantage of the declare form: it allows us to have default values for uniform args. Hmm, this quite possibly means more state changes, as every unspecified uniform must take the default value, whereas currently uniforms are memoized. Ok, so bad idea. I should probably have the concept of sane defaults though. I should move this to a new post :)

It's Just Pretty

2014-11-25 00:21:00 +0000

First some code!:

> (setf c (make-c-array #(1 2 3 4)))
#<C-ARRAY :element-type :UBYTE :dimensions (4)>

> (setf g (make-gpu-array #(1 2 3 4)))
#<GPU-ARRAY :element-type :UBYTE :dimensions (4) :backed-by :BUFFER>

> (setf g (make-texture #(1 2 3 4)))
#<GL-TEXTURE-1D (4)>

> (texref *)
#<GPU-ARRAY :element-type :R8 :dimensions (4) :backed-by :TEXTURE>

I have tweaked the make-*-array and make-*-texture commands so that, if you only pass in a lisp data structure, cepl will try to find the smallest gl type that can be used across the whole data-set.

For example:

(make-c-array #(1 2 3 4)) ;; <- element type will be :ubyte

(make-c-array #(1 -2 3 4)) ;; <- element type will be :byte

(make-c-array #(1 -2 200 4)) ;; <- element type will be :int

(make-c-array #(1 -2 200 4.0)) ;; <- element type will be :float

(make-c-array #(1 -2 200 4.0d0)) ;; <- element type will be :double

This is one of those nice cases where you get to make repl exploration more fun at no cost to speed. The reason is that no high-level code will be using this inference, as it would be stunningly stupid to pay that cost for no reason. All file loading (which is the most common case) will have the type name anyway, so it can just be passed in as usual.
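For the curious, the inference can be sketched as a single pass over the data; the type keywords and ranges below are assumptions for illustration, not cepl’s actual code:

```lisp
;; Hypothetical sketch: find the smallest gl type that can hold every
;; element of DATA (a lisp vector).
(defun infer-element-type (data)
  (let ((min 0) (max 0) (floats nil) (doubles nil))
    (loop for x across data do
         (cond ((typep x 'double-float) (setf doubles t))
               ((floatp x) (setf floats t))
               (t (setf min (min min x)
                        max (max max x)))))
    (cond (doubles :double)                        ; any double promotes everything
          (floats :float)                          ; any float promotes the ints
          ((<= 0 min max 255) :ubyte)
          ((and (<= -128 min) (<= max 127)) :byte)
          (t :int))))

;; (infer-element-type #(1 -2 200 4)) => :INT
```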

Little details, but pleasant ones

p.s. This is in a branch for now as I am hunting down some issues I introduced while modifying the indexing functions.