Let's Design and Build a (mostly) Digital Theremin!

Posted: 6/15/2012 2:29:35 PM

From: Northern NJ, USA

Joined: 2/17/2012

"Yeah - those articles made me feel old and stupid! Like the DSP stuff in Hal's book - I can understand it, but am not able to visualize it.. When I am dealing with circuits, I can "see" the flow of electrons and "hear" the sounds, and am able to change things and 'invent' other ways to do it.." - FredM

There's lots of stuff I can't visualize without assistance.  For logic design I rely heavily on spreadsheets and verilog simulators.  For analog I turn to spice when the circuit is non-standard or too complex.

Doing a simple integrator followed by a differentiator in a spreadsheet can be instructive: you get out exactly what you put in.  Then move on to a simple low-pass filter (see Chamberlin) and feed it a sine wave.  Even with a first-order filter you will see issues with truncating the accumulator value as feedback, and truncation of the input that comes with the attenuation required to place the cutoff point.  Do a second-order filter and you will see this in spades, along with accumulator overflow, limit cycles, etc.

Dattorro's filtering paper shows how to deal with all this using a 16/32-bit processor.  He uses modulo arithmetic in the accumulator, which allows much larger accumulation values than 32 bits gives you, but if it doesn't "unwind" back into the 32-bit range at the end you can get oscillations that need to be squashed via an external mechanism.  He also deals with accumulator truncation noise by taking the error energy and feeding it back into the accumulator through a simple filter.  It's kind of weird, but if the energy can get out somehow it often reduces distortion - modulators work this way too.
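To make the truncation problem concrete, here's a minimal sketch (all names hypothetical, Python standing in for fixed-point hardware) of a first-order low-pass, y += (x - y) * 2^-8, in integer arithmetic.  Plain truncation creates a deadband where small inputs never propagate into the accumulator at all; carrying the quantization error into the next update - a crude version of the error-feedback idea in Dattorro's paper - recovers the lost energy.

```python
SHIFT = 8  # coefficient k = 2**-SHIFT sets the cutoff

def lp_truncate(samples):
    """First-order low-pass; the >> simply discards the fractional bits."""
    y, out = 0, []
    for x in samples:
        y += (x - y) >> SHIFT  # any |x - y| < 2**SHIFT contributes nothing
        out.append(y)
    return out

def lp_error_feedback(samples):
    """Same filter, but the quantization error is fed back into the next step."""
    y, err, out = 0, 0, []
    for x in samples:
        delta = (x - y) + err
        step = delta >> SHIFT
        err = delta - (step << SHIFT)  # truncation error, saved for next sample
        y += step
        out.append(y)
    return out

# A DC input of 100 sits below the 2**SHIFT deadband:
print(lp_truncate([100] * 1000)[-1])        # stuck at 0 forever
print(lp_error_feedback([100] * 5000)[-1])  # settles at 100
```

The same effect shows up immediately in the spreadsheet experiment: a sine wave small enough to fall inside the deadband just disappears from the truncating version.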

I've only done a bit of filtering in C++, and haven't done any audio filtering in FPGAs, so I'm probably not the best authority on this stuff.  But there are some great papers out there, and Chamberlin is really good at demystifying it.  I kind of blame academia for making everything seem overly complex when it usually isn't.  I suppose they have to look as smart as possible when publishing, but many of them seem more interested in the mathematical beauty than in the actual application of what they know - many of my college profs were this way (signals and systems, communications, microwaves, devices, etc.).  Showing students a pile of equations right out of the starting gate (and then testing them on it!) is an absolutely terrible way to teach or learn a subject.  Whatever math is necessary should always follow the intuitive grasp.

IIRC, I heard through a friend that worked in the synth industry that Dattorro got into hot water for revealing too much in that filtering paper.  Industry secrecy is yet another huge stumbling block for those of us trying to understand these things.

Posted: 6/15/2012 3:04:20 PM

From: Northern NJ, USA

Joined: 2/17/2012

"Well, yes, a fixed number of global variables that are constantly being hijacked for any and every purpose. Eek. So what do you do to maintain their integrity and make them local variables, useable for recursive and reentrant code? Push them onto a stack!" - GordonC

Wow Gordon, it's kind of a shock to find someone as knowledgeable and experienced as you are about Forth here at TW - these are both pretty small interest groups.

But yes, stack frames end up on data stacks.  But maybe things are a bit more human readable before they go there?  And one doesn't always have to back up all of the registers, particularly in more modest software implementations.  And stack faults are catastrophic.

Before designing my first processor I read too many positive things about stack machines, which set my expectations too high I suppose.  Now I'm trying to make or find something that is efficient in both code size and speed, and manageable without too many external tools (assemblers, compilers, simulators, emulators, etc.).  Maybe that's me being unrealistic?

Posted: 6/15/2012 4:48:35 PM

From: Croxley Green, Hertfordshire, UK

Joined: 10/5/2005

I don't know - it seems to me there are a lot of parallels between Forth and theremins. I've always been attracted to minority interests. :-)

I think we covered the Forth-not-being-human-readable aspect. It's kind of up there with APL (*). It does get easier with practice, but it is the main reason I developed my little 3D flowcharts.

In some ways, catastrophic faults are the best kind. You know when it went wrong! Preferable to insidious little data corruptions that surface occasionally.

Stack Machines could be great - if they had the backing of Intel or Motorola or similar, but instead they are developed by people like Chuck Moore. Genius that he is, there's still a reason that his consulting company was called Computer Cowboys. Nonetheless, here is a downloadable copy of the definitive tome. It makes interesting reading. (First link on page. Rest of page interesting too.)

(Blimey! I just looked on Amazon. The original hardback is going for £273.95. I think I paid a tenth of that when it came out.)

(*) Conway's Game of Life in APL.

Posted: 6/15/2012 9:47:16 PM

From: Northern NJ, USA

Joined: 2/17/2012

"In some ways, catastrophic faults are the best kind. You know when it went wrong! Preferable to insidious little data corruptions that surface occasionally." -- GordonC

I imagine some stack faults might happen slowly, rather like conventional memory leaks, and thus be quite difficult to track down?

"Stack Machines could be great - if they had the backing of Intel or Motorola or similar, but instead they are developed by people like Chuck Moore. Genius that he is, there's still a reason that his consulting company was called Computer Cowboys."

I do kind of understand how we got to where we are today.  Set out to design one and after one or two decisions you find yourself on very well-worn roads.  It's probably easier to target conventional register machines with higher-level compilers.  Though you'd think engineers would start looking around the room long before they hit the billion-transistors-per-processor mark - it's kind of ludicrous.  I remember buying my first 286 motherboard thinking I could use the hardware to do all kinds of useful things, and then finding out what a dog the instruction set and architecture were, with an overly complex operating system hogging everything.  Things have only gotten exponentially worse.

"Nonetheless, here is a downloadable copy of the definitive tome. It makes interesting reading. (First link on page. Rest of page interesting too.)"

Thanks!  I read that years ago when I stumbled upon it on the web.  More authors should put their older books out there for free.

Posted: 6/16/2012 4:31:00 AM

From: Eastleigh, Hampshire, U.K. ... Fred Mundell ... Electronics Engineer (Primarily Analogue) ... CV Synths 1974-1980 ... Theremin developer 2007 to present ... soon to be Developing / Trading as WaveCrafter.com

Joined: 12/7/2007

You guys are way beyond me here .. but I completely agree with the " finding out what a dog the instruction set and architecture was, and with an overly complex operating system hogging everything" - particularly the OS bit..

But I also wonder if the path was, in many ways, determined by the requirement for lots of programmers to handle the increasing workload - register machines are probably a lot easier to grasp than stack-based ones. Even simple stack operations cause most programmers I have worked with to sweat!

And the OS complexity - this was probably inevitable as more dumb users got their hands on computers.. Even "engineers" these days seem to opt for "solutions" which are packaged with pre-configured bloatware.. I have seen a complex RTOS installed in small MCU projects where all that was needed was a few lines of assembler, and nothing remotely approaching an OS was called for.. But these days there is enough spare capacity in even the smallest MCU that arguments about trimming the code are facile - unless one is doing high-speed real-time stuff.

I think Gordon's comparison of theremins and Forth is spot on - minority interest - but also, interest in ideas / products which are great, but for some (sometimes extremely good) reasons never became widely adopted or popular.

And I also think this is a major motivator for designers like you and me - we can see the greatness in ideas like the theremin, and perhaps we think we can see improvements which would make the instrument more widely popular.. But, like Forth, how much can you "improve" it before it becomes a new language? How many improvements and changes can one make to the theremin before you have created a new instrument?

Fred.

Posted: 6/16/2012 6:39:46 AM

From: Croxley Green, Hertfordshire, UK

Joined: 10/5/2005

"Like Forth, how much can you 'improve' it before it becomes a new language?" -- FredM

"What is Forth?" debates are pretty much like "What is a theremin" debates - the closer you look at it, the less sure you are what it is!

And, plenty of people with experience of other languages come to Forth and think "I can fix that! What it needs is (for example) local variables and infix maths." (Does that sound familiar, Fred?)

Both these things are easy to achieve in a very basic, inefficient way. Getting the sort of efficiencies that hand-crafted Forth can achieve takes quite a lot more effort, building function-specific optimising compilers to (for example) figure out the stack manipulations necessary to give the illusion of local variables while actually just compiling stack manipulation operators. And by the time you have done that you have (1) overcome your fear of stack juggling and (2) realised that you are throwing the baby out with the bathwater.
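For anyone who hasn't juggled stacks, here is a toy sketch (Python standing in for Forth, all names mine) of what "just compiling stack manipulation operators" means: computing (a+b)*(a-b) from two values on the stack with no named locals at all, only OVER/ROT-style shuffling.

```python
# Toy Forth-style data stack: a Python list with the top of stack at the end.
stack = []
def push(x): stack.append(x)
def over(): stack.append(stack[-2])      # ( a b   -- a b a )
def rot():  stack.append(stack.pop(-3))  # ( a b c -- b c a )
def add():  b = stack.pop(); stack.append(stack.pop() + b)
def sub():  b = stack.pop(); stack.append(stack.pop() - b)
def mul():  b = stack.pop(); stack.append(stack.pop() * b)

# (a+b)*(a-b) with a and b on the stack, roughly : F OVER OVER + ROT ROT - * ;
push(7); push(3)        # ( 7 3 )
over(); over(); add()   # ( 7 3 10 )
rot(); rot()            # ( 10 7 3 )
sub()                   # ( 10 4 )
mul()                   # ( 40 )
print(stack)            # [40]
```

The compiler's job in the "local variables" scheme is exactly this: turn names like a and b into the right sequence of OVER/ROT/DUP so the programmer never sees the juggling.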

I have reminded myself of a little story from my college days. The assignment was to write a COBOL program. Looking at it, it was a maths-heavy task clearly better suited to FORTRAN, which I was also learning. So I did what any sensible person would do, and wrote a completely standard COBOL program which made use of COBOL's ability to "declare FORTRAN" at the top of the program, and then I just inserted the FORTRAN program I had written and debugged (actually I did it in BASIC on an Apple ][ and translated it - rather than waiting for the FORTRAN compiler to batch process it on the Prime mainframe, which took hours) right there into the COBOL program. In terms of getting the program to run correctly and developing it in the shortest time, I should have got top marks, but apparently you don't get points for being a smarty-pants and circumnavigating the point of the assignment. I was robbed, I tells you, robbed!

Posted: 6/16/2012 6:54:15 AM

From: Croxley Green, Hertfordshire, UK

Joined: 10/5/2005

Footnote - while I was typing that, this appeared on my computer. (I have a twitter search for "theremin" on an RSS feed, so if anyone tweets about theremins, I get to see it.)

"You think you're stupid? I'm redesigning the dumbest instrument ever made, a 60's theremin whilst twinkpuffing ! pic.twitter.com/9zrYIT4h"

Hahahahaha!

(BTW - you don't want to know what twinkpuffing is. Just accept that it's rude and move on.)

Posted: 6/16/2012 2:32:32 PM

From: Northern NJ, USA

Joined: 2/17/2012

"...ideas / products which are great, but for some (sometimes extremely good) reasons never became widely adopted or popular."  -- FredM

Forth has some interesting things going for it besides stacks and postfix.  It's kind of a language and OS in one.  The tiny footprint allows developing and compiling on the target, which is really neat.  Development code used primarily to exercise the hardware in the lab ("toy" code) can be reused in the final software.  IIRC, user-defined functions are held in a linked list, allowing multiple definitions and redefinition on the fly.  Like Java, the target is a virtual machine, which defines the functions and limits of the language.  I used to think this was a good thing, but after almost daily Java updates I'm not so sure :).
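A toy sketch of that linked-list dictionary behavior (Python standing in for Forth, names hypothetical): lookup walks from the newest definition backward, so redefining a word shadows the old one for new code, while words compiled earlier keep the binding they were compiled against.

```python
# Toy Forth-style dictionary: a linked list with the newest definition first.
class Word:
    def __init__(self, name, fn, prev):
        self.name, self.fn, self.prev = name, fn, prev

latest = None

def define(name, fn):
    global latest
    latest = Word(name, fn, latest)  # new entry becomes the head of the list

def find(name):
    w = latest
    while w is not None:             # walk from newest to oldest
        if w.name == name:
            return w
        w = w.prev
    raise KeyError(name)

define("square", lambda x: x * x)
sq = find("square").fn                 # "compile-time" lookup binds right now
define("fourth", lambda x: sq(sq(x)))  # fourth is built on the current square

define("square", lambda x: 0)          # redefine: shadows, doesn't overwrite
print(find("square").fn(3))   # 0  - new lookups see the redefinition
print(find("fourth").fn(2))   # 16 - the older word still uses the old square
```

That shadowing-not-overwriting property is what makes on-the-fly redefinition safe: nothing already compiled can break.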

One of the things that possibly holds it back commercially is lack of control over the source - can't have your average Joe getting his peepers on that golden Microsoft code.  Another is the very different coding paradigm which requires a somewhat different skill set from the programmer.  For me personally I'm kind of repulsed by all but the most basic of its syntax.

Forth is something very different and practical that's on the fringe.  Being on the fringe, it doesn't get addressed much by the mainstream, so there aren't many technical comparisons (speed, code density, etc.) to more conventional languages, or detractors for that matter.  The Forth noob encounters a situation rather like serving on a jury and hearing only the defendant's case.

Posted: 6/16/2012 4:14:52 PM

From: Northern NJ, USA

Joined: 2/17/2012

"To avoid latency in theremin response, I can imagine a revisit to machine language?" -- RS Theremin

Possibly, though I'm pretty sure a decent processor running an RTOS would have low enough latency to handle a Theremin.

The reasons I'm looking at processors at this point are to:

1. better utilize the FPGA hardware,

2. make development easier,

3. and revisit old projects that never went anywhere after what seems like many man-years of effort on my part.  Though I kind of get the feeling that I'm throwing more good man-years after bad.  If nothing else, my recent processor reading and ideas have somewhat dislodged me from the pure stack approach.

Posted: 6/16/2012 6:51:59 PM

From: Northern NJ, USA

Joined: 2/17/2012

"4 stacks? That sounds like a lot to me."  -- GordonC

Sorry, I should have been clearer: these would be simple stacks with access only to the top element.

The stack operations would be:

- NOP

- push (write)