From the Burrow

Machine Readable GLSL Spec

2015-06-22 08:27:04 +0000

TLDR: Look Here

There was one thing while making my glsl compiler that was terrible; one thing worse than any of the bugs I have hammered out so far.

The GLSL Spec

The spec is available either as pdf or html. It is designed for humans to read but I needed my program to know about every function in glsl. To that end I spent days processing the contents of the pdf, and some other open code, in order to get a list of all the functions with their return & argument types.

Since then I have wanted to update all the functions with information on which versions they are available from, but this has seemed a daunting undertaking.

Just the other day, however, some people put together a new gl documentation site, http://docs.gl. Whilst the information itself is still geared towards human readers, it is at least better laid out and stored in a github repo.

I took a few hours over the weekend to knock up a script that would extract the basic info I need from the glsl spec. Potentially I could do the opengl one too, but for now that is not urgent. You can find the results here.

The data is one big s-expression which contains all the glsl functions and variables.

Layout

Each element looks something like this:

(((("EmitStreamVertex" "void") (("stream" "int"))))
  (("EmitStreamVertex" NIL NIL NIL NIL NIL NIL :400 :410 :420 :430
  :440 :450)))

Each item has two elements:

  • A list of function definitions
  • A list of information on which versions are supported

Function definition

Each function definition is laid out as follows:

  • The first element is a pair of the function name as a string and the return type as a string.
  • The rest of the list is made up of pairs of the argument name as a string and the argument type as a string.

Version Information

Each element in the version info list is laid out as follows:

  • The first element is the name of the function and, in some cases, enough arg info to differentiate it from other incarnations of the same function. This is all one string, which sucks, so I need to parse it so we can apply the version info directly.
  • The rest of the elements are the versions supported. Each is either nil or a keyword specifying the version. The nils are because the original tables were rows of ticks or dashes to indicate the versions supported; I turned the ticks into the version keyword and the dashes into nil. Really I need to remove the nils and I will do that very soon (the sketch below shows the idea).
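
To make the layout concrete, here is a rough sketch of reading the file and splitting one entry into its two halves, dropping the nil placeholders from the version rows. The function names are mine, not part of the data:

(defun read-spec (path)
  ;; the whole file is one big s-expression, so a single read gets everything
  (with-open-file (in path)
    (read in)))

(defun entry-parts (entry)
  ;; each entry is (function-definitions version-info)
  (destructuring-bind (definitions version-info) entry
    (values definitions
            ;; keep only the version keywords, dropping the nil placeholders
            (loop :for (name . versions) :in version-info
                  :collect (cons name (remove nil versions))))))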

GLSL Variables

In the event that the function definition list is nil, the entry is a variable. You will find its name as the first element in the version table, along with the versions that support it.

Right, I hope this helps. This will be in a state of flux for a while, as I also want to extract all the documentation for each entry in glsl, and I’m sure there will be tweaks to be made.

I hope it helps someone else.

New videos coming soon!

2015-06-12 09:40:59 +0000

Last week was a good week for making videos, and cepl in general. With the big merge, primitive osx support and sampler objects it has been feeling like cepl is progressing very quickly all of a sudden.

I have had to throw away a couple of videos I was going to release, however, as they are now very out of date. I had hoped to get full blend-mode support in last night, but as I failed at that it will likely be merged in within the next 3 days. The next videos in the pipeline will be on sampler-objects and blend-modes; I will then re-record my “Intro to CEPL for people who know GL”, as it was one of the casualties of last week’s progress.

Thanks to anyone who is reading this, Ciao

Sampler Objects

2015-06-11 11:09:24 +0000

Textures in opengl have parameters that affect how they are sampled. They dictate what happens when they are magnified or minified, what happens if you sample outside the texture etc.

Sampler objects allow you to override the sampling params in a texture. One sampler object can be used on multiple textures in a single draw call.

As of last night these are now available in cepl if (>= (version-float *gl-context*) 3.3).

You can use them like this

(defvar sampler (make-sampler :wrap #(:repeat :clamp-to-edge :repeat)))

(with-sampling ((tex sampler))
  (map-g #'some-pipeline some-stream :tx tex))

with-sampling overrides a texture’s sampling params with the sampler object’s for the duration of the scope. The texture then reverts to its own sampling parameters.

The functions that are used on samplers to change their parameters also work on textures so:

(setf (lod-bias some-texture) 0.5)
(setf (lod-bias some-sampler) 0.5)

are both valid.

Next up are blend-modes. I have had to hold off making new videos for a few days as they are going out of date so fast as we keep getting new features.

OSX SUPPORT WOOO!

2015-06-05 08:58:14 +0000

IT FINALLY FUCKING WORKS!

Video coming Saturday explaining in detail how to get set up on osx, but the tldr is:

  • run osx-sbcl-launch.sh from the terminal
  • slime-connect
  • (in-package :cepl)

Back soon!

Progress

2015-03-10 02:22:53 +0000

It’s buggy as hell right now but I’ve been working on a new way (for cepl) to write and compose shaders.

You now write gpu functions and compose them using defpipeline; the gpu functions can be used as stages or as regular functions, and varjo will compile them correctly for each task. We have also lost explicit return in favour of implicit tail returns, just like in regular common lisp.

A quick code sample

(defun-g v ((vert g-pc))
  (values (v! (cgl:pos vert) 1.0)
          (:smooth (cgl:col vert))))

(defun-g f ((color :vec4))
  color)

(defpipeline prog-1 (g-> v f))

Compiles to

// vertex shader
#version 330

layout(location = 0) in vec3 fk_vert_position;
layout(location = 1) in vec4 fk_vert_color;

smooth out vec4 out_86v;

void main() {
    vec4 return1;
    vec4 v_tmp_83v_84v = vec4(fk_vert_position,1.0f);
    return1 = fk_vert_color;
    gl_Position = v_tmp_83v_84v;
    out_86v = return1;
}

// fragment shader
#version 330

in vec4 out_86v;

layout(location = 0) out vec4 output_color_87v;

void main() {
    output_color_87v = out_86v;
}
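
For completeness, calling the finished pipeline from the repl looks something like this; some-stream stands in for a vertex stream of g-pc vertices made earlier (the name is just for the example):

;; push the stream through the pipeline defined above
(map-g #'prog-1 some-stream)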

An out of control reddit post

2015-02-24 09:32:48 +0000

I tried to write a short reply to someone on extending a language with macros…I failed…it got long

This was the thread: http://www.reddit.com/r/lisp/comments/2wy18r/is_this_what_people_mean_when_they_say_lisp/


[note: this assumes you’re a lisp newbie, sorry if this is patronising] Is it what ‘THEY’ mean when they say Lisp allows you to invent your own language? No, not really. But it is nice to be able to use symbols normally deemed off limits, right? And syntax is part of language, so what you are doing is in the sphere of language. Let’s have a look at a macro:

(defmacro fn^ (name args &body body)
  `(defun ,name ,args
     ,@body))

Ok so first let’s see what it does and then how it does it.

A macro is a function that runs at a different time, weird right?! The above kind is run before your program is compiled. The arguments will be source code and the return value will be new code to go in place of the old code. So this:

(fn^ some-func (x) (* x 10))

-becomes-

(defun some-func (x) (* x 10))

See what happened there? The macro was given the following arguments: name -> some-func, args -> (x), body -> ((* x 10))
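
If you want to check this for yourself, macroexpand-1 shows you the code a macro call returns:

(macroexpand-1 '(fn^ some-func (x) (* x 10)))
;; => (DEFUN SOME-FUNC (X) (* X 10))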

If not, you may want to read up on ‘backquote syntax’; it is (approximately) a very tidy way of splicing lists and data together.
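
As a quick taste of it: backquote builds a list template, comma drops a single value in, and comma-at splices a whole list in:

(let ((x 10)
      (rest '(20 30)))
  `(1 2 ,x ,@rest))
;; => (1 2 10 20 30)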

Ok, so now that we have functions and this kind of macro (yes, there are other kinds), we can imagine redefining every common construct in common lisp.

Let’s say let becomes val<- and make-hash-table becomes hsh^:

(fn^ fnc (%) (val<- ((x (hsh^)) (y :dflt)) (<> % x)))

Well, it’s clearly still lisp but it’s very unfamiliar (and ugly! :D). While this may affect language design it doesn’t feel like a new language. Let’s do something else with these macros:

(defmacro val<- (&rest form)
  (let* ((body (last form))
         (bindings (butlast form))
         (b (loop for i below (1- (length bindings)) by 2 collect
                 (list (elt bindings i) (elt bindings (1+ i))))))
    `(let ,b
       ,@body)))

so now

(val<- ((x (hsh^)) (y :dflt))
       (<> % x))

could be written as

(val<- x (hsh^) y :dflt
       (<> % x))

OK! So now this is different. This feels more like changing the language, as we are messing with how the code behaves. The issue is… now that we are messing with the language, we not only have the normal problems a programmer deals with but also those of a language designer. These are issues of feel and experience as well as functionality (it’s very like api design but with much deeper implications); the answers are often subjective and based on what the language is designed to solve. A big one for me is ‘Does the programmer have to think more to achieve something using this?’.

In my own project I use macros to make defining glsl shaders feel like defining regular lisp functions.

(defshader some-name ((x :int))
  (+ x 10))

And translate the lisp code into glsl for the programmer.

We have only scratched the surface of what we can do in this little blogpost. There are also other kinds of macros, reader macros for example. Reader macros are functions that run before the macros we have seen already. They don’t receive the code as lists of symbols; they receive the actual characters from the code you wrote. They allow for cool and often crazy stuff: all of the literal syntax you see around lisp (‘ ` , #() etc.) is made possible using reader macros, and you can extend that in any way you can conceive.
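
As a tiny, made-up example, this reader macro teaches the reader a [a b c] syntax that reads as (list a b c):

(defun read-square-bracket (stream char)
  (declare (ignore char))
  ;; collect forms until the closing ] and wrap them in a call to list
  (cons 'list (read-delimited-list #\] stream t)))

(set-macro-character #\[ #'read-square-bracket)
(set-macro-character #\] (get-macro-character #\)))

;; now [1 2 (+ 1 2)] reads as (list 1 2 (+ 1 2)), which evaluates to (1 2 3)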

I hope this was of some use. Sorry it turned into a whole post rather than a comment!

p.s. Also check out:

  • http://enthusiasm.cozy.org/archives/2013/07/optima <- macros give CL pattern matching
  • https://gist.github.com/chaitanyagupta/9324402 <- info on reader macros
  • symbol-macrolet <- this type of macro swaps out a symbol for another whole form
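
As a tiny illustration of that last one (the struct and names below are made up for the example):

(defstruct point x y)

(let ((p (make-point :x 1 :y 2)))
  (symbol-macrolet ((px (point-x p)))
    ;; every use of px expands to (point-x p), so setf works on it too
    (setf px 10)
    px))
;; => 10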

Cepl Video Update

2015-02-04 23:18:58 +0000

Just stuck a new video on youtube showing a bit of multipass rendering, normal mapping and refraction.

What a fucking surprise

2015-01-16 17:10:56 +0000

RE: Your Content Feed Is Broken

2015-01-16 09:33:18 +0000

https://medium.com/@tyrale/your-content-feed-is-broken-f8c6576077c2

“Our Point of Acknowledgment is the moment when we (the user) understand what we are viewing and can then recall that this information has been consumed. It is a continuous moving variable that helps us track where we are in time.”

Time tracking doesn’t require acknowledging every point though, just enough discrete points that we get a sense of moving forward, coupled with the critical assurance that the points will not rearrange themselves.

Movie and book analogies are broken; both experiences are designed knowing the content volume is fixed (designed is a very generous term if you are dealing with DVD/Blu-ray UIs… or most devices for hard media, for that matter).

Interesting concept, but the naive implementation just optimizes for a different viewing habit. The author acknowledges this by saying: “Each feed will need to choose an appropriate location to place new users.” This is a non-trivial problem; deciding a user’s content priority is hard, and if you are picking one and dictating it then we are just setting things up so that another well-intentioned blog post like this will come along saying ‘wouldn’t it be great to have our media presented like X’.

“whining as loud as the rest of you” - Dismissing issues as whining is a massive fallacy. User experience is not just design, it isn’t just entry points, it covers the gamut of ways the user and the medium interact; experience is not a one way phenomenon.

How you treat people’s data is part of the experience, because if you fuck up they experience it. Whether it is losing credit card details or leaking browsing habits, the experience lives beyond a page view if you choose to retain data.

The fundamental question we are dealing with, once again, is ‘how do I track my position in a world changing too rapidly to comprehend alone?’.

This question remains open and is hard. I believe these problems are solvable without going down the road of companies/websites/etc saying ‘well, with just one more bit of data we can make your life better, wait… no, just two more bits’.

Eval into file

2015-01-08 00:28:51 +0000

Just wrote a handy function for working with slime.

  • It takes the top level form at that point e.g. (make-instance 'test)
  • Wraps it in a defparameter with a new var name (defparameter <iv-0> (make-instance 'test))
  • Evals it using slime and injects the var name (in this case <iv-0>) into the file just after the toplevel form

This is useful when sketching out ideas in a file and you want to compile something but also capture the result in a global for messing with later

Running it 3 times would give you something like:

(make-instance 'test)
<iv-2> <iv-1> <iv-0>

which makes it very easy to wrap a list around and throw it where you see fit.

(defvar eval-into-file-count 0)
(defun slime-eval-into-file ()
  "Evaluate the current toplevel form, store the result in a
new global and insert the var name into the file."
  (interactive)
  (let* ((form (slime-defun-at-point))
         (var-name (concat "<iv-" (number-to-string eval-into-file-count) ">"))
         ;; wrap the toplevel form in a defparameter bound to the new var
         (form-with-var (concat "(defparameter " var-name " " form ")")))
    (setq eval-into-file-count (+ eval-into-file-count 1))
    (end-of-defun)
    (slime-eval-async `(swank:eval-and-grab-output ,form-with-var)
      (lambda (result)
        (cl-destructuring-bind (output value) result
          ;; value is the printed name of the new global; drop it into
          ;; the buffer just after the toplevel form
          (push-mark)
          (insert value " "))))))

Ciao
