From: Jürgen Böhm
Subject: Lisp on an FPGA
Date: 
Message-ID: <g07osd$su7$00$1@news.t-online.com>
Under the working title "LispmFPGA" I recently completed a first
milestone on the way to an autonomous, "bare metal" Lisp on a CPU
specifically designed to execute compiled Lisp code.

The system on a chip (SoC) comprises

- a Lisp-adapted, stack-oriented, microprogrammed 32-bit CPU
(given the preliminary name E01)
- a character-oriented VGA controller
- an SRAM controller
- a PS/2 keyboard controller
- a serial input/output controller
- some bus logic

All this is implemented in Verilog HDL on a Xilinx Spartan 3 FPGA. The
FPGA board currently used provides, in addition to the above features,
1 MB (256K x 32 bit) of SRAM.

On the software side there exist:

- a compiler, written in Common Lisp, which translates a subset of
Common Lisp into the machine code of the E01 processor. This subset
shall be called Lisp-E01. The compiler code itself mostly uses only
constructs from Lisp-E01.

- an instruction-level emulator of the E01 processor, written in Common Lisp.

- a simulator of the full FPGA SoC, including interrupts produced by key
presses and correct reaction to writes into the VGA memory. This is
written in C++/Qt.

- a VPI-driven interface providing VGA output and key input to a Verilog
testbench model of the SoC

- a system kernel written purely in Lisp-E01 providing

   - the data types

     cons, string, array (1-dim), symbol, 32-bit fixnum,
     template, closure

     and a basic set of manipulation functions

   - a stop-and-copy garbage collector (a minimal sketch follows this
     list)

   - functions for keyboard input and elementary output

- building on the kernel: a simple text-editor application written in
  Lisp-E01

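For readers curious how the collector works in principle, here is a
minimal sketch of a Cheney-style stop-and-copy pass in plain Common Lisp
over a toy heap of two-field cells. It is purely illustrative: the heap
layout and all names are invented for the example and are not the actual
kernel code.

;; Toy heap: a vector of two-element lists (CAR CDR), where each field
;; is either an index into the heap or NIL.
(defun cheney-collect (from-space roots)
  "Copy everything reachable from ROOTS (a list of indices into
FROM-SPACE) into a fresh to-space; return the to-space and new roots."
  (let ((to-space (make-array 0 :fill-pointer 0 :adjustable t))
        (forward  (make-hash-table)))        ; old index -> new index
    (labels ((copy (idx)
               (cond ((null idx) nil)
                     ((gethash idx forward)) ; already forwarded
                     (t (let ((new (vector-push-extend
                                    (copy-list (aref from-space idx))
                                    to-space)))
                          (setf (gethash idx forward) new)
                          new)))))
      (let ((new-roots (mapcar #'copy roots))
            (scan 0))
        ;; Cheney scan: fix up each copied cell, copying whatever it
        ;; refers to, until the scan pointer catches up with the
        ;; allocation pointer (the fill pointer of TO-SPACE).
        (loop while (< scan (fill-pointer to-space))
              do (let ((cell (aref to-space scan)))
                   (setf (first cell)  (copy (first cell))
                         (second cell) (copy (second cell)))
                   (incf scan)))
        (values to-space new-roots)))))
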
Additional information (and some pictures) can be found at

www.aviduratas.de/lisp/lispmfpga/

  Currently there is nearly no documentation, but if, as a result of
this announcement, someone becomes interested in this project, perhaps
wishing to take part in the development, I will write up the necessary
documentation.

  Actually, it would be very welcome if someone could take part in the
project, as in a few weeks I will have next to no time left to drive
the project further.

If you ask about possible uses of such a project, the primary answer is
probably:

- educational purposes

including educating oneself (as I did) about Lisp, compiler and
interpreter construction, FPGAs, Verilog, computer architecture and
microprocessor design,

as well as perhaps some day writing an introductory text that introduces
the above topics by following the course of the LispmFPGA project.

- a Lisp programming toy

Additionally, as a system like the above (even with more RAM) can be
realized for about $100, it might become a toy for Lisp enthusiasts and
a learning tool for young people who want to own and program a computer
they can understand and control completely, as the PET 2001 or C64 were
in former times.

- a microcontroller programmable in Lisp(????)

This would make a complete overhaul of the CPU necessary, to get a more
compact opcode structure. Also, the compiler would have to be improved in
terms of the time and space efficiency of the generated code. Currently

size(object code) ~ 1.8 * size(source code)

holds, which is probably too large for embedded applications.


Anyone who is interested in the project is invited to mail me!

Greetings

Jürgen Böhm

-- 
Jürgen Böhm                                            www.aviduratas.de
"At a time when so many scholars in the world are calculating, is it not
desirable that some, who can, dream ?"  R. Thom

From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <482784a6$0$11614$607ed4bc@cv.net>
Jürgen Böhm wrote:
> Under the working title "LispmFPGA" I recently completed a first
> milestone on the way to an autonomous, "bare metal" Lisp on a CPU
> specifically designed to execute compiled Lisp code.
> 
> The system on a chip (SoC) comprises
> 
> - a Lisp adapted, stack oriented, microprogrammed 32 bit-CPU
> (given the preliminary name of E01)
> - a character oriented VGA controller
> - an SRAM controller
> - PS/2 keyboard controller
> - serial input/output controller
> - some bus logic
> 
> All this is implemented with Verilog HDL on a Xilinx Spartan 3 FPGA. The
> FPGA board as used now provides in addition to the above features
> 1MB=256Kx32bit SRAM.
> 
> On the software side, there exists
> 
> - a compiler, written in Common Lisp, which translates a subset of
> Common Lisp into the machine code of the E01-processor. This subset
> shall be called Lisp-E01. The compiler-code itself uses mostly only
> constructs from Lisp-E01.
> 
> - an instruction level emulator of the E01-processor written in Common Lisp.
> 
> - a simulator of the full FPGA SoC including interrupts produced by key
> presses and correct reaction to writes into the VGA-memory. This is
> written in C++/Qt
> 
> - a VPI driven interface providing VGA output and key-input to a Verilog
> testbed model of the SoC
> 
> - a system kernel written purely in Lisp-E01 providing
> 
>    - the data types
> 
>     cons, string,
> 		array (1-dim),
> 		symbol,
> 		32bit-fixnum,
> 		template, closure
> 
>     and a basic set of manipulation functions
> 		
>    - a stop and copy garbage collector
> 	
>    - functions for keyboard input and elementary output
> 	
> - building on the kernel: A simple text-editor application written in
> Lisp-E01
> 	
> 
> Additional information (and some pictures) can be found at
> 
> www.aviduratas.de/lisp/lispmfpga/
> 
>   Currently there is nearly no documentation, but if, as a result of
> this announcement, someone is becoming interested in this project, maybe
> wishing to take part in the development, I will write up the necessary
> documentation.
> 
>   Actually it would be very welcome if someone could take part in the
> project, as in a few weeks I will have next to nil time anymore to drive
> the project further.
> 
> If you ask for possible uses of such a project, the primary answer is
> probably:
> 
> - educational purposes
> 
> including educating oneself (as I did) about Lisp, compiler and
> interpreter construction, FPGAs, Verilog, computer architecture,
> microprocessor design,
> 
> as well as maybe sometime writing an introductory text which provides
> introduction to the above topics following the course of the LispmFPGA
> project.
> 
> - a lisp programming toy
> 
> Additionally, as a system like the above (even with more RAM) can be
> realized for about 100$, it might become a toy for Lisp enthusiasts and
> a learning tool for young people who want to own and program a computer
> they can understand and control completely, as was the PET 2001 or C64
> in former times.
> 
> - a microcontroller programmable in Lisp(????)
> 
> This would make a complete overhaul of the CPU necessary, to get a more
> compact opcode structure. Also the compiler had to be improved in terms
> of time and space efficiency of the generated code. Currently
> 
> size(obj.code) ~ 1.8 size(source code)
> 
> holds, which is probably to large for embedded applications.
> 
> 
> Anyone who is interested in the project is invited to mail me!
> 
> Greetings
> 
> Jürgen Böhm
> 

Wow. :)

Congrats. Looks like it would be good to ride around driving a robot, 
eh? I could use a pet.

kenny

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Rob Warnock
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <GOWdnZWD_MMIJ7rVnZ2dnUVZ_uudnZ2d@speakeasy.net>
Ken Tilton  <·········@gmail.com> wrote:
+---------------
| Jürgen Böhm wrote:
| > Under the working title "LispmFPGA" I recently completed a first
| > milestone on the way to an autonomous, "bare metal" Lisp on a CPU
| > specifically designed to execute compiled Lisp code.
...
| 
| Wow. :)
| 
| Congrats. Looks like it would be good to ride around driving a robot, 
| eh? I could use a pet.
+---------------

Also see:

    http://xkcd.com/413/
    New Pet


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Rob Warnock
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <GOWdnZqD_MOCJLrVnZ2dnUVZ_uudnZ2d@speakeasy.net>
Jürgen Böhm  <······@gmx.net> wrote:
+---------------
| Under the working title "LispmFPGA" I recently completed a first
| milestone on the way to an autonomous, "bare metal" Lisp on a CPU
| specifically designed to execute compiled Lisp code.
+---------------

Neat! Looks like a fun project!

+---------------
| All this is implemented with Verilog HDL on a Xilinx Spartan 3 FPGA.
+---------------

Just curious... Why did you pick the Spartan 3 over other Xilinx FPGAs?
Was it mainly the availability/price of the Starter Kit?

If I were to do something like this myself, I'd probably not bother
with designing a CPU per se but just use some flavor of the Xilinx
Virtex II line which has an on-chip PPC processor (1-4, actually,
depending on model/cost), and then just code a Lisp VM in tight
PPC assembler [but compiled from sexpr-based DSL code, of course!].

+---------------
| The FPGA board as used now provides in addition to the above features
| 1MB=256Kx32bit SRAM.
+---------------

In <http://www.aviduratas.de/lisp/lispmfpga/projectlog.html> you note
that this is an "ISSI" SRAM, but I don't seem to see it in the picture
of the Starter Kit. Is it on the other side of the board?

+---------------
| - a lisp programming toy
| 
| Additionally, as a system like the above (even with more RAM) can be
| realized for about 100$, it might become a toy for Lisp enthusiasts and
| a learning tool for young people who want to own and program a computer
| they can understand and control completely, as was the PET 2001 or C64
| in former times.
+---------------

Again, I might be tempted to look at other platforms that already have
a CPU included, and concentrate my efforts on a tight Lisp VM and/or
integration between the Lisp code and the "custom hardware" that the
FPGA gives you. That is, an ARM or PPC plus an FPGA, or any of a number
of 8- or 16-bit CPUs with built-in hardcoded USB interfaces, e.g., PIC
or Atmel, etc. These pages list a number of such for USB 1.1 and USB 2.0:

    http://www.beyondlogic.org/usb/usbhard.htm
    http://www.beyondlogic.org/usb/usbhard2.htm

Oh, cute! Here's one [found via the above pages] that has your
Xilinx Spartan 3 FPGA on it, along with a Cypress CY68013A FX2LP
USB microcontroller, 64 MiB SDRAM, 1 MiB SRAM (512 KiW x 18),
8 MiB flash, and other goodies:

    http://www.opalkelly.com/products/xem3050/

Oops, price is a bit steep: $750/q1. Oh, well, they also have a smaller
[only 31 MiB SDRAM, no SRAM or flash], cheaper one ["only"(?) $350/q1]:

    http://www.opalkelly.com/products/xem3010/

Or if one prefers Ethernet, something like the PIC18F97J60 family,
which has embedded Ethernet and enough on-chip RAM to run the core
of a Lisp VM (with the user code being in external SRAM, of course).
There are lots of interesting little Ethernet -- and some WiFi --
development boards at reasonable prices here:

    http://www.beyondlogic.org/etherip/ip.htm

Or both, with Atmel's ARM-based AT91RM9200, which has both USB
and 10/100 Ethernet. [Hmmm... Pricing is only $12/q50k [$25/q1?],
but the development board is $5000. Ouch!]

Wow! I didn't realize the array of choices of development boards and
interfaces was so wide. This is going to take some more study...  ;-}  ;-}

+---------------
| - a microcontroller programmable in Lisp(????)
| 
| This would make a complete overhaul of the CPU necessary,
| to get a more compact opcode structure. ...
+---------------

Yeah, for that you might want to look at one of the versions of the
Xilinx Virtex II [mentioned above] which already have embedded CPUs.

But one of your indirect points is still quite correct: It's gotten
quite difficult for the average beginning experimenter to put together
"computers" unless you use some kind of pre-built development board.
(*sigh*)

Anyway, good luck with it...


-Rob

-----
Rob Warnock			<····@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607
From: Jürgen Böhm
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <g09vok$fhe$02$1@news.t-online.com>
Rob Warnock wrote:
> 
> Just curious... Why did you pick the Spartan 3 over other Xilinx FPGAs?
> Was it mainly the availability/price of the Starter Kit?
> 

I was inspired to work with an FPGA as a specialized Lisp-processor in
2006 (or even at the end of 2005) by Frank Buss' Website:

http://www.frank-buss.de/lispcpu/index.html

There he gives the reference to the Spartan 3 Starter Kit board, that he
intended to use.
   After reading about it at the Xilinx site I got the impression that the
board would fit the requirements for a little Lisp SoC. Actually my
knowledge of the FPGA world was small at that time; I did not even
consider the whole world of Altera parts, but in hindsight it was a
very good choice.
     Today I cannot even reconstruct how I came to the conclusion that
the FPGA would be big enough for the project, but possibly I reasoned:
"If Xilinx can fit their 32-bit MicroBlaze, a VGA controller, serial I/O
and the glue logic onto it in a demo application, it must be big enough
for a Lisp CPU." This was nearly right, but with a narrow margin: the
XC3S200 Spartan 3 provides 1920 "slices", of which the project uses 1918.

> If I were to do something like this myself, I'd probably not bother
> with designing a CPU per se but just use some flavor of the Xilinx
> Virtex II line which has an on-chip PPC processor (1-4, actually,
> depending on model/cost), and then just code a Lisp VM in tight
> PPC assembler [but compiled from sexpr-based DSL code, of course!].
> 

Of course, one could do it that way, but for me the wish to design a
specific CPU was the starting point of everything. Looking back I can
say that the learning experience was well worth it.

> +---------------
> | The FPGA board as used now provides in addition to the above features
> | 1MB=256Kx32bit SRAM.
> +---------------
> 
> In <http://www.aviduratas.de/lisp/lispmfpga/projectlog.html> you note
> that this is an "ISSI" SRAM, but I don't seem to see it in the picture
> of the Starter Kit. Is it on the other side of the board?
> 

Yes, two small ISSI IS61LV25616AL-10T chips.

> 
> Again, I might be tempted to look at other platforms that already have
> a CPU included, and concentrate my efforts on a tight Lisp VM and/or
> integration between the Lisp code and the "custom hardware" that the
> FPGA gives you. That is, an ARM or PPC plus an FPGA, or any of a number
> of 8- or 16-bit CPUs with built-in hardcoded USB interfaces, e.g., PIC
> or Atmel, etc. These pages list a number of such for USB 1.1 and USB 2.0:
> 
>     http://www.beyondlogic.org/usb/usbhard.htm
>     http://www.beyondlogic.org/usb/usbhard2.htm
> 
> Oh, cute! Here's one [found via the above pages] that has your
> Xilinx Spartan 3 FPGA on it, along with a Cypress CY68013A FX2LP
> USB microcontroller, 64 MiB SDRAM, 1 MiB SRAM (512 KiW x 18),
> 8 MiB flash, and other goodies:
> 
>     http://www.opalkelly.com/products/xem3050/
> 
> Oops, price is a bit steep: $750/q1. Oh, well, they also have a smaller
> [only 31 MiB SDRAM, no SRAM or flash], cheaper one ["only"(?) $350/q1]:
> 
>     http://www.opalkelly.com/products/xem3010/
> 

I found the OpalKelly parts myself through the page

http://www.homebrewcpu.com/magic-16.htm

  which is part of a website describing a computer self-building project
that goes even closer to the basics than mine and was also a source of
encouragement and admiration for me.

   But about the OpalKelly boards, I think you do not get that much value
for that much money. Compare, for example:

http://www.nuhorizons.com/xilinx/boards/spartan-3/SP3-1500-2000Board/index.asp

if you insist on a Spartan 3. If you want to use the even better Virtex
4, you get

http://www.xilinx.com/products/devkits/HW-V4-ML401-UNI-G.htm

  which I consider in some sense to be the best buy in the Xilinx world,
if you want to spend more than $200 and less than $500 and do not take
software bundling into account. The problems of building a DRAM controller
discussed below apply here too, but in a less severe form. Also, at the
beginning one can use the 8Mb ZBT SRAM.

Otherwise, one of the kits at

http://www.xilinx.com/products/boards/s3_sk_promo.htm

  gives good value for less than $200. The problem here is that most of
these kits provide DDR RAM or even DDR2 RAM, for which writing a reliable
controller is difficult. Additionally, a cache controller becomes
necessary, as the DDR/DDR2 RAM on these boards comes with only a 16-bit
datapath, so you need a cache with a 32-bit datapath to the processor
and burst-mode transfers to the RAM.

If you are prepared to bid Xilinx farewell and go to Altera, you find
this wonderful kit:

http://www.terasic.com.tw/cgi-bin/page/archive.pl?Language=English&CategoryNo=39&No=226

It is not cheap at $599, but with all its features it is worth every cent.

Of course, none of the FPGAs on these boards contain an embedded
processor, so they are better suited to the approach of designing the
processor oneself.


> 
> Wow! I didn't realize the array of choices of development boards and
> interfaces was so wide. This is going to take some more study...  ;-}  ;-}

Maybe during the next weeks I will update my published bookmarks; they
will then contain a fairly complete list of available FPGA boards under
$700 that come into question for such a project.

> 
> +---------------
> | - a microcontroller programmable in Lisp(????)
> | 
> | This would make a complete overhaul of the CPU necessary,
> | to get a more compact opcode structure. ...
> +---------------
> 
> Yeah, for that you might want to look at one of the versions of the
> Xilinx Virtex II [mentioned above] which already have embedded CPUs.
> 

  Actually, I was thinking more in the direction of optimizing the opcode
structure. Currently every instruction uses one word (32 bits), but with a
better decoding stage the opcode size could be halved, or even quartered,
for many opcodes, giving an opcode structure where two or four opcodes
are packed into a machine word. This would reduce code size and the
number of memory accesses for opcode fetch.
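
To illustrate the packing idea (the field layout here is an arbitrary
choice for the example, not the actual E01 encoding): two 16-bit opcodes
share one 32-bit word, and the decoder simply extracts the two halves.

;; Pack two hypothetical 16-bit opcodes into one 32-bit word and take
;; them apart again; putting the first opcode in the high halfword is
;; an arbitrary choice for the example.
(defun pack-opcodes (op1 op2)
  (dpb op1 (byte 16 16) (ldb (byte 16 0) op2)))

(defun unpack-opcodes (word)
  (values (ldb (byte 16 16) word)    ; first opcode
          (ldb (byte 16 0)  word)))  ; second opcode

;; (pack-opcodes #x0123 #x4567) => #x01234567
;; (unpack-opcodes #x01234567)  => 291, 17767  (= #x0123, #x4567)
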
  Additionally one could provide more combined instructions by using a
larger microcode memory. For example the combination

LITIDX d
SYMBOL-FUNCTION

appears quite often in the code (using 8 bytes). By providing a single

LIT-IDX-SYM-FUN d

  one could save 4 bytes here. A statistical analysis of the generated
code should lead to further ideas for such concatenations. (As far as I
know, the designers of the CLISP virtual machine used this optimization.)
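
Such a statistic is easy to gather; here is a quick sketch, assuming the
generated code is available as a plain list of mnemonic symbols (an
invented representation, the real compiler output looks different):

;; Count how often each pair of adjacent opcodes occurs and list the
;; most frequent pairs; these are candidates for combined instructions.
(defun count-opcode-pairs (opcodes)
  (let ((counts (make-hash-table :test #'equal)))
    (loop for (a b) on opcodes while b
          do (incf (gethash (list a b) counts 0)))
    counts))

(defun most-frequent-pairs (counts n)
  (let ((pairs '()))
    (maphash (lambda (pair count) (push (cons pair count) pairs))
             counts)
    (subseq (sort pairs #'> :key #'cdr)
            0 (min n (length pairs)))))

;; A pair like ((LITIDX SYMBOL-FUNCTION) . <count>) would then show up
;; near the top of (most-frequent-pairs (count-opcode-pairs code) 10).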

  Generally, in the microcontroller world a stack-based CPU still seems
to be a valid approach: a lot of embedded programming is done in Forth,
there are special Forth chips, and there are still new ideas for making
stack processors more efficient.

> But one of your indirect points is still quite correct: It's gotten
> quite difficult for the average beginning experimenter to put together
> "computers" unless you use some kind of pre-built development board.
> (*sigh*)
> 

But see the pointer above (..homebrewcpu..) and also:

http://www.cc86.org/~dkuschel/

> Anyway, good luck with it...
> 
> 

Thank you; the best I can hope for is that it might become interesting
for others too.

- Jürgen

-- 
Jürgen Böhm                                            www.aviduratas.de
"At a time when so many scholars in the world are calculating, is it not
desirable that some, who can, dream ?"  R. Thom
From: Frank Buss
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <6u3tdsnls3u8.10eju87goejt1$.dlg@40tude.net>
Jürgen Böhm wrote:

> Maybe during the next weeks I will update my published bookmarks, they
> will the contain a fairly complete list of available FPGA boards under
> 700$ that come into question for such a project.

Don't forget to add this page, if not already in your bookmarks:

http://www.fpga-faq.com/FPGA_Boards.shtml

>   Generally in the microcontroller world, a stack based CPU seems to be
> still a valid approach, a lot of embedded programming is done in Forth,
> there are special Forth chips and there are still new ideas for making
> stack processors more efficient.

I think this is true and it was one reason why I first tried to learn some
Forth before continuing with the Lisp CPU. Now I'm programming an embedded
system in Forth and planning to do more in this language, so the Lisp CPU
is postponed. I've even started to create a Forth CPU:

http://www.frank-buss.de/forth/cpu1/

Every opcode is 16 bits long. Because every function (called a "word" in
Forth) of a typical Forth program calls many other words, there is one
dedicated bit that makes an opcode a call (which limits the callable
memory to 32 kB, but this is no problem for most embedded systems). In
theory every opcode needs two or three clock cycles, but I was a bit lazy
with BRAM read access, so the current version needs 4 to 6 clock cycles.
It works great at 50 MHz.
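
In Lisp terms the decoding is roughly this simple (a sketch only; which
bit is the call bit and how the target address is scaled are assumptions
for the example, here the top bit and a byte address):

;; Decode a 16-bit opcode word: if the (assumed) call bit is set, the
;; remaining 15 bits are the call target, which is why only the low
;; 32 kB are callable.
(defun decode-opcode (word)
  (if (logbitp 15 word)
      (list :call (ldb (byte 15 0) word))
      (list :primitive (ldb (byte 15 0) word))))

;; (decode-opcode #x8042) => (:CALL 66)
;; (decode-opcode #x0007) => (:PRIMITIVE 7)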

-- 
Frank Buss, ··@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
From: Frank Buss
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <1h5fjfpn1vl99$.157j1nxsoh6xg.dlg@40tude.net>
Jürgen Böhm wrote:

> Under the working title "LispmFPGA" I recently completed a first
> milestone on the way to an autonomous, "bare metal" Lisp on a CPU
> specifically designed to execute compiled Lisp code.

Nice work. Good to see that one of my many unfinished projects was at
least some inspiration for this project.

> - a simulator of the full FPGA SoC including interrupts produced by key
> presses and correct reaction to writes into the VGA-memory. This is
> written in C++/Qt

A Lisp CPU simulator written in C++ ?

-- 
Frank Buss, ··@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <4828bef9$0$15171$607ed4bc@cv.net>
Frank Buss wrote:
> Jürgen Böhm wrote:
> 
> 
>>Under the working title "LispmFPGA" I recently completed a first
>>milestone on the way to an autonomous, "bare metal" Lisp on a CPU
>>specifically designed to execute compiled Lisp code.
> 
> 
> Nice work. Good to see that one of my many unfinished projects at least was
> some inspiration for this project.
> 
> 
>>- a simulator of the full FPGA SoC including interrupts produced by key
>>presses and correct reaction to writes into the VGA-memory. This is
>>written in C++/Qt
> 
> 
> A Lisp CPU simulator written in C++ ?
> 

He didn't know about Cells-Gtk. :)

You know, one of the things I noticed about Cells just recently is what
it does. Took long enough, eh? Anyway, I noticed that Cells is about
change. Cells give the programmer a hook into any change, and, more fun,
Cells gives program state causal power over other causal state. I.e., one
thing can change another. So Cells is about the most fundamental concept
one can have in modelling, which is what programmers do.

The natural consequence of this insight was the realization that 
something like Cells should have been the first thing we did in 
programming, and maybe even should have been built into the hardware, 
not that I know what I am talking about. But now that we have Lisp so 
close to the iron....?

kenny


-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Frank Buss
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <81aeko1w52wx$.17u3g6cq9u0wu$.dlg@40tude.net>
Ken Tilton wrote:

> You know, one of the things I noticed about Cells just recently is what 
> it does. Took long enough, eh? Anyway, I noticed that Cells is about 
> change. Cells give the programmer a hook into any change, and more fun, 
> Cells gives prgram state causal power over other causal state. ie, One 
> thing can change another. So Cells is about the most fundamental concept 
> one can have in modelling, which is what programmers do.

Yes, dataflow concepts are very interesting. I've bought this book on dead
trees:

http://www.jpaulmorrison.com/fbp/book.pdf

Last week it arrived, and now I'm reading it when I go by train. I hope to
learn something more about the basic concepts. Cells is a pragmatic
implementation, but I would like to know more about the theory behind it.

> The natural consequence of this insight was the realization that 
> something like Cells should have been the first thing we did in 
> programming, and maybe even should have been built into the hardware, 
> not that I know what I am talking about. But now that we have Lisp so 
> close to the iron....?

Computer hardware, and schematics in general, are very much dataflow
oriented: you have many hardwired gates, like AND, NAND, flip-flops, etc.,
and with every clock pulse the bits flow through the gates. When
programming in VHDL, the "program" describes such a dataflow architecture,
too. VHDL is very verbose, but it is a kind of declarative language (
http://en.wikipedia.org/wiki/Declarative_programming ), which produces a
gate configuration. You describe blocks of gates, called entities, with
inputs and outputs, then instantiate the entities and connect the inputs
and outputs, just like in Cells.
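
A toy version of that analogy in plain Common Lisp (this is not the Cells
API, just an illustrative sketch): a "wire" is either a settable input or
a formula over other wires, and reading a formula wire recomputes it from
its inputs, like a gate output following its input wires.

(defstruct wire value formula inputs)

(defun wire-read (w)
  "Inputs return their value; formula wires recompute from their inputs."
  (if (wire-formula w)
      (apply (wire-formula w) (mapcar #'wire-read (wire-inputs w)))
      (wire-value w)))

;; Two input wires and an AND gate connected to them:
(defparameter *a* (make-wire :value 0))
(defparameter *b* (make-wire :value 1))
(defparameter *and-gate*
  (make-wire :formula #'logand :inputs (list *a* *b*)))

;; (wire-read *and-gate*)     => 0
;; (setf (wire-value *a*) 1)
;; (wire-read *and-gate*)     => 1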

I don't know if it is possible to build general dataflow hardware. With
newer FPGA chips it is possible to reconfigure parts of the gates on the
fly at runtime, so it is possible to "compile" a dataflow program
immediately to a hardware configuration. This would be nice: think of
millions of interacting cells, where all cells are updated in one clock
cycle in parallel. This would be very similar to the human brain.

But if cells are the basic building blocks of the hardware, I don't know
whether this works well for all problems. Things like GUIs, pattern
recognition, etc. are a good fit for dataflow languages, but I think that
imperative algorithms, e.g. quicksort, would look more complicated in
dataflow languages. Maybe a network of many hardware cells and some
conventional CPU cores on one die would be a good idea.

-- 
Frank Buss, ··@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <4828fe25$0$11633$607ed4bc@cv.net>
Frank Buss wrote:
> Ken Tilton wrote:
> 
> 
>>You know, one of the things I noticed about Cells just recently is what 
>>it does. Took long enough, eh? Anyway, I noticed that Cells is about 
>>change. Cells give the programmer a hook into any change, and more fun, 
>>Cells gives prgram state causal power over other causal state. ie, One 
>>thing can change another. So Cells is about the most fundamental concept 
>>one can have in modelling, which is what programmers do.
> 
> 
> Yes, dataflow concepts are very interesting. I've bought this book on dead
> trees:
> 
> http://www.jpaulmorrison.com/fbp/book.pdf
> 
> Last week it arrived, now I'm reading it, when I go by train. I hope to
> learn something more about the basic concepts. Cells is a pragmatic
> implementation, but I would like to know more about the theory behind it.

Paul Tarvydas just turned me on to that prior art, but I did not know it
was online. I have to draw myself a few fingers of single malt and have a
read.

My Google skills must be deficient, I never saw that one, and it is 
about as close as one can get. Lessee, Morrison writes: "...a set of 
concepts which, based on my experience over the last 30 years, I think 
really does provide a quantum jump in improving application development 
productivity."

What a nut job! Any #lisp yobbo can tell you Cells is a figment of my 
imagination. A couple of them blogged their disappointment that I 
focused on the tired old Cells story at ECLM. Meanwhile exactly one hand 
in ninety-six went up when I asked how many of them used Cells, and that 
person is a Cells committer!

Let's see what else my fellow delusionist has to say:

"While this technology has been in use for productive work for the last 
20 years, it has also been waiting in the wings, so to speak, for its 
right time to come on stage. Perhaps because there is a "paradigm
shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
widely known up to now, but I believe now is the time to open it up to a 
wider public."

Bad news, old sport, 2008 and the yobbos still don't get it. Apparently 
they need /documentation/. Well, OpenLaszlo gets it, FRP gets it, 
Flapjax and Adobe Adam get it...

"The time is now ripe for a new paradigm to replace the von Neumann 
model as the bridging model between hardware and software."

1994. Trying to popularize something in use for twenty years. But in 
2008 Mr. Rhodes describes Cells as "the dataflow paradigm that Kenny 
believes is central to all simple applications".

    http://www.advogato.org/person/crhodes/diary.html?start=125

Er, can we at least dismiss me accurately? "...that Kenny believes is 
central to simplifying development of complex applications." Thanks.

It's going to be very embarrassing for everyone when they have to 
subscribe to cells-devel.... nah, they'll write their own before they 
would do that. :)

> 
> 
>>The natural consequence of this insight was the realization that 
>>something like Cells should have been the first thing we did in 
>>programming, and maybe even should have been built into the hardware, 
>>not that I know what I am talking about. But now that we have Lisp so 
>>close to the iron....?
> 
> 
> Computer hardware, and schematics in general, are very much dataflow
> oriented: You have many hardwired gates, like AND, NAND, flip-flops etc.,
> and with every clock pulse the bits flow through the gates. When
> programming in VHDL, the "program" describes such a dataflow architecture,
> too. VHDL is very redundant, but it is some kind of declarative language (
> http://en.wikipedia.org/wiki/Declarative_programming ), which produces a
> gate configuration. You describe blocks of gates, called entities, with
> inputs and outputs, instantiate the entities and connect the inputs and
> outputs, same like in Cells.
> 
> I don't know if it is possible to build a general dataflow hardware. With
> newer FPGA chips it is possible to reconfigure parts of the gates
> on-the-fly at runtime, so it is possible to "compile" a dataflow program
> immediatly to a hardware configuration. This would be nice: Think of
> millions of interacting cells, where all cells are updated in one clock
> cycle in parallel. This would be very similar to the human brain.
> 
> But if cells are the basic building blocks of a hardware, I don't know if
> this works well for all problems.

Oh, no, Tilton's Law is clear on this: "All X all the time is always wrong."

> Things like GUI, pattern recognition etc.
> are good for dataflow languages, but I think that things like imperative
> algorithms, e.g. quick sort, would look more complicated with dataflow
> languages.

Absolutely, never let the paradigm tail wag the development dog. 
Whatever that means.

kt

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <48290369$0$11597$607ed4bc@cv.net>
Ken Tilton wrote:
> 
> 
> Frank Buss wrote:
> 
>> Ken Tilton wrote:
>>
>>
>>> You know, one of the things I noticed about Cells just recently is 
>>> what it does. Took long enough, eh? Anyway, I noticed that Cells is 
>>> about change. Cells give the programmer a hook into any change, and 
>>> more fun, Cells gives prgram state causal power over other causal 
>>> state. ie, One thing can change another. So Cells is about the most 
>>> fundamental concept one can have in modelling, which is what 
>>> programmers do.
>>
>>
>>
>> Yes, dataflow concepts are very interesting. I've bought this book on 
>> dead
>> trees:
>>
>> http://www.jpaulmorrison.com/fbp/book.pdf
>>
>> Last week it arrived, now I'm reading it, when I go by train. I hope to
>> learn something more about the basic concepts. Cells is a pragmatic
>> implementation, but I would like to know more about the theory behind it.
> 
> 
> Paul Tarvydas just turned me on to that prior art, but I did not know it 
> was on-line. I have to draw myself a few fingers of single malt have a 
> read.
> 
> My Google skills must be deficient, I never saw that one, and it is 
> about as close as one can get. Lessee, Morrison writes: "...a set of 
> concepts which, based on my experience over the last 30 years, I think 
> really does provide a quantum jump in improving application development 
> productivity."
> 
> What a nut job! Any #lisp yobbo can tell you Cells is a figment of my 
> imagination. A couple of them blogged their disappointment that I 
> focused on the tired old Cells story at ECLM. Meanwhile exactly one hand 
> in ninety-six went up when I asked how many of them used Cells, and that 
> person is a Cells committer!
> 
> Let's see what else my fellow delusionist has to say:
> 
> "While this technology has been in use for productive work for the last 
> 20 years, it has also been waiting in the wings, so to speak, for its 
> right time to come on stage. Perhaps because there is a "paradigm
> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
> widely known up to now, but I believe now is the time to open it up to a 
> wider public."
> 
> Bad news, old sport, 2008 and the yobbos still don't get it. Apparently 
> they need /documentation/. Well, OpenLaszlo gets it, FRP gets it, 
> Flapjax and Adobe Adam get it...
> 
> "The time is now ripe for a new paradigm to replace the von Neumann 
> model as the bridging model between hardware and software."

This is scary. Just a couple of weeks ago I was thinking I might do 
better taking precisely this angle in shilling Cells:

"This feeling that the subject matter of this book is fun is one of the
most common reactions we have encountered, and is one of the main things 
which makes my collaborators and myself believe that we have stumbled on 
something important."

Word.

kzo

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <482904c5$0$11606$607ed4bc@cv.net>
>>
>> http://www.jpaulmorrison.com/fbp/book.pdf
>>
> 
> My Google skills must be deficient, I never saw that one, and it is 
> about as close as one can get. Lessee, Morrison writes: "...a set of 
> concepts which, based on my experience over the last 30 years, I think 
> really does provide a quantum jump in improving application development 
> productivity."
> 
> What a nut job! Any #lisp yobbo can tell you Cells is a figment of my 
> imagination. A couple of them blogged their disappointment that I 
> focused on the tired old Cells story at ECLM. Meanwhile exactly one hand 
> in ninety-six went up when I asked how many of them used Cells, and that 
> person is a Cells committer!
> 
> Let's see what else my fellow delusionist has to say:
> 
> "While this technology has been in use for productive work for the last 
> 20 years, it has also been waiting in the wings, so to speak, for its 
> right time to come on stage. Perhaps because there is a "paradigm
> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
> widely known up to now, but I believe now is the time to open it up to a 
> wider public."
> 
> Bad news, old sport, 2008 and the yobbos still don't get it. Apparently 
> they need /documentation/. Well, OpenLaszlo gets it, FRP gets it, 
> Flapjax and Adobe Adam get it...
> 
> "The time is now ripe for a new paradigm to replace the von Neumann 
> model as the bridging model between hardware and software."

"It forces developers to focus on data and its transformations, rather 
than starting with procedural code. It encourages rapid prototyping and 
results in more reliable, more maintainable systems. It is compatible 
with distributed systems, and appears to be on a convergent path with 
Object-Oriented Programming. [See Cells. ed]

"Does it sound too good to be true? You be the judge! In the following
pages, we will be describing what I believe is a genuine revolution in 
the process of creating application programs to support the data 
processing requirements of companies around the world."

Raving lunatic!

"In this book, I will describe the concepts underlying this technology 
and give examples of experience gained using it."

Cool, now I do not have to document Cells.

kzo

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Pascal Costanza
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <68t0v7F2un9asU2@mid.individual.net>
Ken Tilton wrote:
> 
>>>
>>> http://www.jpaulmorrison.com/fbp/book.pdf
>>>
>>
>> My Google skills must be deficient, I never saw that one, and it is 
>> about as close as one can get. Lessee, Morrison writes: "...a set of 
>> concepts which, based on my experience over the last 30 years, I think 
>> really does provide a quantum jump in improving application 
>> development productivity."
>>
>> What a nut job! Any #lisp yobbo can tell you Cells is a figment of my 
>> imagination. A couple of them blogged their disappointment that I 
>> focused on the tired old Cells story at ECLM. Meanwhile exactly one 
>> hand in ninety-six went up when I asked how many of them used Cells, 
>> and that person is a Cells committer!
>>
>> Let's see what else my fellow delusionist has to say:
>>
>> "While this technology has been in use for productive work for the 
>> last 20 years, it has also been waiting in the wings, so to speak, for 
>> its right time to come on stage. Perhaps because there is a "paradigm
>> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
>> widely known up to now, but I believe now is the time to open it up to 
>> a wider public."
>>
>> Bad news, old sport, 2008 and the yobbos still don't get it. 
>> Apparently they need /documentation/. Well, OpenLaszlo gets it, FRP 
>> gets it, Flapjax and Adobe Adam get it...
>>
>> "The time is now ripe for a new paradigm to replace the von Neumann 
>> model as the bridging model between hardware and software."
> 
> "It forces developers to focus on data and its transformations, rather 
> than starting with procedural code. It encourages rapid prototyping and 
> results in more reliable, more maintainable systems. It is compatible 
> with distributed systems, and appears to be on a convergent path with 
> Object-Oriented Programming. [See Cells. ed]
> 
> "Does it sound too good to be true? You be the judge! In the following
> pages, we will be describing what I believe is a genuine revolution in 
> the process of creating application programs to support the data 
> processing requirements of companies around the world."
> 
> Raving lunatic!
> 
> "In this book, I will describe the concepts underlying this technology 
> and give examples of experience gained using it."

Does it talk about efficiency concerns? Cyclic dependencies? 
Dependencies between collections?


Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <4829a835$0$15171$607ed4bc@cv.net>
Pascal Costanza wrote:
> Ken Tilton wrote:
> 
>>
>>>>
>>>> http://www.jpaulmorrison.com/fbp/book.pdf
>>>>
>>>
>>> My Google skills must be deficient, I never saw that one, and it is 
>>> about as close as one can get. Lessee, Morrison writes: "...a set of 
>>> concepts which, based on my experience over the last 30 years, I 
>>> think really does provide a quantum jump in improving application 
>>> development productivity."
>>>
>>> What a nut job! Any #lisp yobbo can tell you Cells is a figment of my 
>>> imagination. A couple of them blogged their disappointment that I 
>>> focused on the tired old Cells story at ECLM. Meanwhile exactly one 
>>> hand in ninety-six went up when I asked how many of them used Cells, 
>>> and that person is a Cells committer!
>>>
>>> Let's see what else my fellow delusionist has to say:
>>>
>>> "While this technology has been in use for productive work for the 
>>> last 20 years, it has also been waiting in the wings, so to speak, 
>>> for its right time to come on stage. Perhaps because there is a 
>>> "paradigm
>>> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
>>> widely known up to now, but I believe now is the time to open it up 
>>> to a wider public."
>>>
>>> Bad news, old sport, 2008 and the yobbos still don't get it. 
>>> Apparently they need /documentation/. Well, OpenLaszlo gets it, FRP 
>>> gets it, Flapjax and Adobe Adam get it...
>>>
>>> "The time is now ripe for a new paradigm to replace the von Neumann 
>>> model as the bridging model between hardware and software."
>>
>>
>> "It forces developers to focus on data and its transformations, rather 
>> than starting with procedural code. It encourages rapid prototyping 
>> and results in more reliable, more maintainable systems. It is 
>> compatible with distributed systems, and appears to be on a convergent 
>> path with Object-Oriented Programming. [See Cells. ed]
>>
>> "Does it sound too good to be true? You be the judge! In the following
>> pages, we will be describing what I believe is a genuine revolution in 
>> the process of creating application programs to support the data 
>> processing requirements of companies around the world."
>>
>> Raving lunatic!
>>
>> "In this book, I will describe the concepts underlying this technology 
>> and give examples of experience gained using it."
> 
> 
> Does it talk about efficiency concerns? Cyclic dependencies? 
> Dependencies between collections?

251pp, no TOC, no index, one heckuva bibliography, link still above: 
knock yourself out. (I stopped reading, I have work to do.)

The concerns raised are all tool-level, nothing about Actually Building
Applications. Why am I not surprised? Morrison, presumably unaware of my
ECLM (kinda) beach rants, would respond:

"By the way, I should state at the outset that my focus is not on 
mathematical applications, but on business applications - the former is 
a different ball-game, and one happily played by academics all over the 
world. Business applications are different, and much of my work has been 
to try to determine exactly why they should be so different, and what we 
can do to solve the problem of building and maintaining them."

Maybe this is why I am the only one who likes Cells (being the only one 
besides Edi who writes applications). The "why" business apps are 
different is covered by Brooks in his "No Silver Bullet" essays. 
Something like a sort algorithm can be fascinating while being concerned 
with exactly one bit of state: the sort key. Business applications -- I 
guess it would be like trying to perform a delicate chemistry experiment 
on the A-train in NYC.

Morrison also offers these precise statistics I am sure would pass any 
peer review: "a set of concepts which, based on my experience over the 
last 30 years, I think really does provide a quantum jump in improving 
application development productivity". In other words, this report is 
coming from people in the field doing Actual Work and the improvement we 
  experience is astonishing and exciting and has us doing our best to 
share it with people apparently no more open to new ideas than our 
latest troll, Natty Dread.

The problem for the yobbos is that they now have to call me and Morrison 
liars -- well, hang on, they have Mastenbrook for that. :)

Something I already quoted from Morrison: "While this technology has 
been in use for productive work for the last 20 years...". See? You 
really do have to call us liars, we have been having great fun and 
success with them for decades (and telling you about it for as long).

I thought of a good example this AM. When I did Celtk, starting from the
LTK codebase, it was obvious that Cells observers could be used to shunt
information from the Lisp world to the Tk world. In a few days I was
supporting more of Tk than the mature LTk, which required hand-coding of
the necessary plumbing. So LTk "needed Cells", it just did not know it.
And that does not even get into the cool thing about Cells and GUIs, 
which is linking the components (vs linking Lisp to Tcl). But back to my 
point, it /is/ a paradigm shift. An obvious one to me, since I write 
applications, but maybe other folks need to learn it before they will 
understand its applicability.

One quick note on the efficiency concern: well, the system automatically 
identifies the exact dependency graph (dynamically changing to suit new 
circumstances) so we always update all and (wait for it) only those 
things that need to be updated as the world turns, with no burden on the 
application programmer. It's like GC over manual memory management, only 
better, because determining the dependency graph in most cases is a 
one-time charge.
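
In toy form (not the actual Cells machinery, just an illustration of "all
and only"): each input remembers which formulas depend on it, and
assigning a new value touches exactly those and nothing else.

(defstruct node value compute dependents)

(defun node-set (node new-value)
  "Store NEW-VALUE, then recompute only the nodes depending on NODE."
  (setf (node-value node) new-value)
  (dolist (dep (node-dependents node))
    (node-set dep (funcall (node-compute dep)))))

;; Example: *area* depends on *width*; *unrelated* is never touched.
(defparameter *width*     (make-node :value 3))
(defparameter *unrelated* (make-node :value 42))
(defparameter *area*
  (make-node :compute (lambda () (* 10 (node-value *width*)))))
(push *area* (node-dependents *width*))

;; (node-set *width* 5)   ; recomputes *area*, leaves *unrelated* alone
;; (node-value *area*)    => 50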

kt

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <4829ba76$0$25034$607ed4bc@cv.net>
Ken Tilton wrote:
> 
> 
> Pascal Costanza wrote:
> 
>> Ken Tilton wrote:
>>
>>>
>>>>>
>>>>> http://www.jpaulmorrison.com/fbp/book.pdf
>>>>>
>>>>
>>>> My Google skills must be deficient, I never saw that one, and it is 
>>>> about as close as one can get. Lessee, Morrison writes: "...a set of 
>>>> concepts which, based on my experience over the last 30 years, I 
>>>> think really does provide a quantum jump in improving application 
>>>> development productivity."
>>>>
>>>> What a nut job! Any #lisp yobbo can tell you Cells is a figment of 
>>>> my imagination. A couple of them blogged their disappointment that I 
>>>> focused on the tired old Cells story at ECLM. Meanwhile exactly one 
>>>> hand in ninety-six went up when I asked how many of them used Cells, 
>>>> and that person is a Cells committer!
>>>>
>>>> Let's see what else my fellow delusionist has to say:
>>>>
>>>> "While this technology has been in use for productive work for the 
>>>> last 20 years, it has also been waiting in the wings, so to speak, 
>>>> for its right time to come on stage. Perhaps because there is a 
>>>> "paradigm
>>>> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
>>>> widely known up to now, but I believe now is the time to open it up 
>>>> to a wider public."
>>>>
>>>> Bad news, old sport, 2008 and the yobbos still don't get it. 
>>>> Apparently they need /documentation/. Well, OpenLaszlo gets it, FRP 
>>>> gets it, Flapjax and Adobe Adam get it...
>>>>
>>>> "The time is now ripe for a new paradigm to replace the von Neumann 
>>>> model as the bridging model between hardware and software."
>>>
>>>
>>>
>>> "It forces developers to focus on data and its transformations, 
>>> rather than starting with procedural code. It encourages rapid 
>>> prototyping and results in more reliable, more maintainable systems. 
>>> It is compatible with distributed systems, and appears to be on a 
>>> convergent path with Object-Oriented Programming. [See Cells. ed]
>>>
>>> "Does it sound too good to be true? You be the judge! In the following
>>> pages, we will be describing what I believe is a genuine revolution 
>>> in the process of creating application programs to support the data 
>>> processing requirements of companies around the world."
>>>
>>> Raving lunatic!
>>>
>>> "In this book, I will describe the concepts underlying this 
>>> technology and give examples of experience gained using it."
>>
>>
>>
>> Does it talk about efficiency concerns? Cyclic dependencies? 
>> Dependencies between collections?
> 
> 
> 251pp, no TOC, no index, one heckuva bibliography, link still above: 
> knock yourself out. (I stopped reading, I have work to do.)
> 
> The concerns raised are all tool-level, nothing about Actually Building 
> Applications. Why am I not surprised? Morrison, presumably unaware of my 
> ECLM (kinda) beach rants would respond:
> 
> "By the way, I should state at the outset that my focus is not on 
> mathematical applications, but on business applications - the former is 
> a different ball-game, and one happily played by academics all over the 
> world. Business applications are different, and much of my work has been 
> to try to determine exactly why they should be so different, and what we 
> can do to solve the problem of building and maintaining them."
> 
> Maybe this is why I am the only one who likes Cells (being the only one 
> besides Edi who writes applications). The "why" business apps are 
> different is covered by Brooks in his "No Silver Bullet Essays". 
> Something like a sort algorithm can be fascinating while being concerned 
> with exactly one bit of state: the sort key. Business applications -- I 
> guess it would be like trying to perform a delicate chemistry experiment 
> on the A-train in NYC.
> 
> Morrison also offers these precise statistics I am sure would pass any 
> peer review: "a set of concepts which, based on my experience over the 
> last 30 years, I think really does provide a quantum jump in improving 
> application development productivity". In other words, this report is 
> coming from people in the field doing Actual Work and the improvement we 
>  experience is astonishing and exciting and has us doing our best to 
> share it with people apparently no more open to new ideas than our 
> latest troll, Natty Dread.
> 
> The problem for the yobbos is that they now have to call me and Morrison 
> liars -- well, hang on, they have Mastenbrook for that. :)
> 
> Something I already quoted from Morrison: "While this technology has 
> been in use for productive work for the last 20 years...". See? You 
> really do have to call us liars, we have been having great fun and 
> success with them for decades (and telling you about it for as long).
> 
> I thought of a good example this AM. When I did Celtk, starting from the 
> LTK codebase, it was obvious that Cells observers could be used to shunt 
> information form the Lisp world to the Tk world. In a few days I was 
> supporting more of Tk than the mature LTk, which required hand-coding of 
> the necesarry plumbing. So LTk "needed Cells", it just did not know it. 
> And that does not even get into the cool thing about Cells and GUIs, 
> which is linking the components (vs linking Lisp to Tcl). But back to my 
> point, it /is/ a paradigm shift. An obvious one to me, since I write 
> applications, but maybe other folks need to learn it before they will 
> understand its applicability.
> 
> One quick note on the efficiency concern: well, the system automatically 
> identifies the exact dependency graph (dynamically changing to suit new 
> circumstances) so we always update all and (wait for it) only those 
> things that need to be updated as the world turns, with no burden on the 
> application programmer. It's like GC over manual memory management, only 
> better, because determining the dependency graph in most cases is a 
> one-time charge.

Oh, and a note on cyclic dependencies. I plan to do something along
those lines when one of my applications needs it, unlike the many and
varied other applications I have developed over the past thirteen years.
(Hint.)

Looks easy to do something primitive, btw, but recall that Steele et al 
tried supporting all these elaborate forms of interdependence not 
because they needed them but because they had thought of them and came 
to grief in what I have christened The Bridge Too Far scenario.

We have the concept of premature optimization, how about premature 
powerfication? Everything the constraints crowd tried (partial 
constraints, multi-way constraints) was a fun idea but hell to program, 
and look where it got them: Steele retreated to Scheme and then to Jave 
and now to Fortran.

I did a bit of the same early on with GUI geometry, offering ways for a 
widget to calculate a value if one was not specified. No right? Do we 
have left and width? That led to weeks of pain and suffering, then I 
realized I never needed it. In any particular case of GUI authoring I 
always knew how to calculate all the bounds. If I knew the width, I 
simply specified either the right as left plus width or the left as 
right minus width. If I knew neither left nor right I did not know what 
I as doing, and if I knew left, right, and width they better agree and 
then I can shut up about the width.

I may as well touch on the last: collections? That is just engineering. 
Peter Hildebrandt is playing with something in re hashtables, and I have 
done so-called "aggregate" cells in the past. I guess object databases 
and RDF triple stores count as well. As I said, it is just a question 
of engineering -- one has to take the state one has in mind and create 
the glue that lets the Cells engine manage it.

kzo

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Pascal Costanza
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <68u6fsF2uss7vU1@mid.individual.net>
Ken Tilton wrote:
> 
> 
> Ken Tilton wrote:
>>
>>
>> Pascal Costanza wrote:
>>
>>> Ken Tilton wrote:
>>>
>>>>
>>>>>>
>>>>>> http://www.jpaulmorrison.com/fbp/book.pdf
>>>>>>
>>>>>
>>>>> My Google skills must be deficient, I never saw that one, and it is 
>>>>> about as close as one can get. Lessee, Morrison writes: "...a set 
>>>>> of concepts which, based on my experience over the last 30 years, I 
>>>>> think really does provide a quantum jump in improving application 
>>>>> development productivity."
>>>>>
>>>>> What a nut job! Any #lisp yobbo can tell you Cells is a figment of 
>>>>> my imagination. A couple of them blogged their disappointment that 
>>>>> I focused on the tired old Cells story at ECLM. Meanwhile exactly 
>>>>> one hand in ninety-six went up when I asked how many of them used 
>>>>> Cells, and that person is a Cells committer!
>>>>>
>>>>> Let's see what else my fellow delusionist has to say:
>>>>>
>>>>> "While this technology has been in use for productive work for the 
>>>>> last 20 years, it has also been waiting in the wings, so to speak, 
>>>>> for its right time to come on stage. Perhaps because there is a 
>>>>> "paradigm
>>>>> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
>>>>> widely known up to now, but I believe now is the time to open it up 
>>>>> to a wider public."
>>>>>
>>>>> Bad news, old sport, 2008 and the yobbos still don't get it. 
>>>>> Apparently they need /documentation/. Well, OpenLaszlo gets it, FRP 
>>>>> gets it, Flapjax and Adobe Adam get it...
>>>>>
>>>>> "The time is now ripe for a new paradigm to replace the von Neumann 
>>>>> model as the bridging model between hardware and software."
>>>>
>>>>
>>>>
>>>> "It forces developers to focus on data and its transformations, 
>>>> rather than starting with procedural code. It encourages rapid 
>>>> prototyping and results in more reliable, more maintainable systems. 
>>>> It is compatible with distributed systems, and appears to be on a 
>>>> convergent path with Object-Oriented Programming. [See Cells. ed]
>>>>
>>>> "Does it sound too good to be true? You be the judge! In the following
>>>> pages, we will be describing what I believe is a genuine revolution 
>>>> in the process of creating application programs to support the data 
>>>> processing requirements of companies around the world."
>>>>
>>>> Raving lunatic!
>>>>
>>>> "In this book, I will describe the concepts underlying this 
>>>> technology and give examples of experience gained using it."
>>>
>>>
>>>
>>> Does it talk about efficiency concerns? Cyclic dependencies? 
>>> Dependencies between collections?
>>
>>
>> 251pp, no TOC, no index, one heckuva bibliography, link still above: 
>> knock yourself out. (I stopped reading, I have work to do.)
>>
>> The concerns raised are all tool-level, nothing about Actually 
>> Building Applications. Why am I not surprised? Morrison, presumably 
>> unaware of my ECLM (kinda) beach rants, would respond:
>>
>> "By the way, I should state at the outset that my focus is not on 
>> mathematical applications, but on business applications - the former 
>> is a different ball-game, and one happily played by academics all over 
>> the world. Business applications are different, and much of my work 
>> has been to try to determine exactly why they should be so different, 
>> and what we can do to solve the problem of building and maintaining 
>> them."
>>
>> Maybe this is why I am the only one who likes Cells (being the only 
>> one besides Edi who writes applications). The "why" business apps are 
>> different is covered by Brooks in his "No Silver Bullet Essays". 
>> Something like a sort algorithm can be fascinating while being 
>> concerned with exactly one bit of state: the sort key. Business 
>> applications -- I guess it would be like trying to perform a delicate 
>> chemistry experiment on the A-train in NYC.
>>
>> Morrison also offers these precise statistics I am sure would pass any 
>> peer review: "a set of concepts which, based on my experience over the 
>> last 30 years, I think really does provide a quantum jump in improving 
>> application development productivity". In other words, this report is 
>> coming from people in the field doing Actual Work and the improvement 
>> we experience is astonishing and exciting and has us doing our best 
>> to share it with people apparently no more open to new ideas than our 
>> latest troll, Natty Dread.
>>
>> The problem for the yobbos is that they now have to call me and 
>> Morrison liars -- well, hang on, they have Mastenbrook for that. :)
>>
>> Something I already quoted from Morrison: "While this technology has 
>> been in use for productive work for the last 20 years...". See? You 
>> really do have to call us liars, we have been having great fun and 
>> success with them for decades (and telling you about it for as long).
>>
>> I thought of a good example this AM. When I did Celtk, starting from 
>> the LTK codebase, it was obvious that Cells observers could be used to 
>> shunt information from the Lisp world to the Tk world. In a few days I 
>> was supporting more of Tk than the mature LTk, which required 
>> hand-coding of the necessary plumbing. So LTk "needed Cells", it just 
>> did not know it. And that does not even get into the cool thing about 
>> Cells and GUIs, which is linking the components (vs linking Lisp to 
>> Tcl). But back to my point, it /is/ a paradigm shift. An obvious one 
>> to me, since I write applications, but maybe other folks need to learn 
>> it before they will understand its applicability.
>>
>> One quick note on the efficiency concern: well, the system 
>> automatically identifies the exact dependency graph (dynamically 
>> changing to suit new circumstances) so we always update all and (wait 
>> for it) only those things that need to be updated as the world turns, 
>> with no burden on the application programmer. It's like GC over manual 
>> memory management, only better, because determining the dependency 
>> graph in most cases is a one-time charge.
> 
> Oh, and a note on cyclic dependencies. I plan to do something along 
> those lines when one of my applications needs it, unlike the many and 
> varied applications I have developed over the past thirteen years. 
> (Hint.)
> 
> Looks easy to do something primitive, btw, but recall that Steele et al 
> tried supporting all these elaborate forms of interdependence not 
> because they needed them but because they had thought of them and came 
> to grief in what I have christened The Bridge Too Far scenario.
> 
> We have the concept of premature optimization, how about premature 
> powerfication? Everything the constraints crowd tried (partial 
> constraints, multi-way constraints) was a fun idea but hell to program, 
> and look where it got them: Steele retreated to Scheme and then to Java 
> and now to Fortran.
> 
> I did a bit of the same early on with GUI geometry, offering ways for a 
> widget to calculate a value if one was not specified. No right? Do we 
> have left and width? That led to weeks of pain and suffering, then I 
> realized I never needed it. In any particular case of GUI authoring I 
> always knew how to calculate all the bounds. If I knew the width, I 
> simply specified either the right as left plus width or the left as 
> right minus width. If I knew neither left nor right I did not know what 
> I was doing, and if I knew left, right, and width they better agree and 
> then I can shut up about the width.
> 
> I may as well touch on the last: collections? That is just engineering. 
> Peter Hildebrandt is playing with something in re hashtables, and I have 
> done so-called "aggregate" cells in the past. I guess object databases 
> and RDF triple stores count as well. As I said, it is just a question 
> of engineering -- one has to take the state one has in mind and create 
> the glue that lets the Cells engine manage it.

This already sounds a lot more balanced than "it's the best thing since 
sliced bread and solves all problems for you." ;)

Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <482b40c5$0$15188$607ed4bc@cv.net>
Pascal Costanza wrote:
> Ken Tilton wrote:
> 
>>
>>
>> Ken Tilton wrote:
>>
>>>
>>>
>>> Pascal Costanza wrote:
>>>
>>>> Ken Tilton wrote:
>>>>
>>>>>
>>>>>>>
>>>>>>> http://www.jpaulmorrison.com/fbp/book.pdf
>>>>>>>
>>>>>>
>>>>>> My Google skills must be deficient, I never saw that one, and it 
>>>>>> is about as close as one can get. Lessee, Morrison writes: "...a 
>>>>>> set of concepts which, based on my experience over the last 30 
>>>>>> years, I think really does provide a quantum jump in improving 
>>>>>> application development productivity."
>>>>>>
>>>>>> What a nut job! Any #lisp yobbo can tell you Cells is a figment of 
>>>>>> my imagination. A couple of them blogged their disappointment that 
>>>>>> I focused on the tired old Cells story at ECLM. Meanwhile exactly 
>>>>>> one hand in ninety-six went up when I asked how many of them used 
>>>>>> Cells, and that person is a Cells committer!
>>>>>>
>>>>>> Let's see what else my fellow delusionist has to say:
>>>>>>
>>>>>> "While this technology has been in use for productive work for the 
>>>>>> last 20 years, it has also been waiting in the wings, so to speak, 
>>>>>> for its right time to come on stage. Perhaps because there is a 
>>>>>> "paradigm
>>>>>> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
>>>>>> widely known up to now, but I believe now is the time to open it 
>>>>>> up to a wider public."
>>>>>>
>>>>>> Bad news, old sport, 2008 and the yobbos still don't get it. 
>>>>>> Apparently they need /documentation/. Well, OpenLaszlo gets it, 
>>>>>> FRP gets it, Flapjax and Adobe Adam get it...
>>>>>>
>>>>>> "The time is now ripe for a new paradigm to replace the von 
>>>>>> Neumann model as the bridging model between hardware and software."
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> "It forces developers to focus on data and its transformations, 
>>>>> rather than starting with procedural code. It encourages rapid 
>>>>> prototyping and results in more reliable, more maintainable 
>>>>> systems. It is compatible with distributed systems, and appears to 
>>>>> be on a convergent path with Object-Oriented Programming. [See 
>>>>> Cells. ed]
>>>>>
>>>>> "Does it sound too good to be true? You be the judge! In the following
>>>>> pages, we will be describing what I believe is a genuine revolution 
>>>>> in the process of creating application programs to support the data 
>>>>> processing requirements of companies around the world."
>>>>>
>>>>> Raving lunatic!
>>>>>
>>>>> "In this book, I will describe the concepts underlying this 
>>>>> technology and give examples of experience gained using it."
>>>>
>>>>
>>>>
>>>>
>>>> Does it talk about efficiency concerns? Cyclic dependencies? 
>>>> Dependencies between collections?
>>>
>>>
>>>
>>> 251pp, no TOC, no index, one heckuva bibliography, link still above: 
>>> knock yourself out. (I stopped reading, I have work to do.)
>>>
>>> The concerns raised are all tool-level, nothing about Actually 
>>> Building Applications. Why am I not surprised? Morrison, presumably 
>>> unaware of my ECLM (kinda) beach rants, would respond:
>>>
>>> "By the way, I should state at the outset that my focus is not on 
>>> mathematical applications, but on business applications - the former 
>>> is a different ball-game, and one happily played by academics all 
>>> over the world. Business applications are different, and much of my 
>>> work has been to try to determine exactly why they should be so 
>>> different, and what we can do to solve the problem of building and 
>>> maintaining them."
>>>
>>> Maybe this is why I am the only one who likes Cells (being the only 
>>> one besides Edi who writes applications). The "why" business apps are 
>>> different is covered by Brooks in his "No Silver Bullet Essays". 
>>> Something like a sort algorithm can be fascinating while being 
>>> concerned with exactly one bit of state: the sort key. Business 
>>> applications -- I guess it would be like trying to perform a delicate 
>>> chemistry experiment on the A-train in NYC.
>>>
>>> Morrison also offers these precise statistics I am sure would pass 
>>> any peer review: "a set of concepts which, based on my experience 
>>> over the last 30 years, I think really does provide a quantum jump in 
>>> improving application development productivity". In other words, this 
>>> report is coming from people in the field doing Actual Work and the 
>>> improvement we experience is astonishing and exciting and has us 
>>> doing our best to share it with people apparently no more open to new 
>>> ideas than our latest troll, Natty Dread.
>>>
>>> The problem for the yobbos is that they now have to call me and 
>>> Morrison liars -- well, hang on, they have Mastenbrook for that. :)
>>>
>>> Something I already quoted from Morrison: "While this technology has 
>>> been in use for productive work for the last 20 years...". See? You 
>>> really do have to call us liars, we have been having great fun and 
>>> success with them for decades (and telling you about it for as long).
>>>
>>> I thought of a good example this AM. When I did Celtk, starting from 
>>> the LTK codebase, it was obvious that Cells observers could be used 
>>> to shunt information from the Lisp world to the Tk world. In a few 
>>> days I was supporting more of Tk than the mature LTk, which required 
>>> hand-coding of the necessary plumbing. So LTk "needed Cells", it just 
>>> did not know it. And that does not even get into the cool thing about 
>>> Cells and GUIs, which is linking the components (vs linking Lisp to 
>>> Tcl). But back to my point, it /is/ a paradigm shift. An obvious one 
>>> to me, since I write applications, but maybe other folks need to 
>>> learn it before they will understand its applicability.
>>>
>>> One quick note on the efficiency concern: well, the system 
>>> automatically identifies the exact dependency graph (dynamically 
>>> changing to suit new circumstances) so we always update all and (wait 
>>> for it) only those things that need to be updated as the world turns, 
>>> with no burden on the application programmer. It's like GC over 
>>> manual memory management, only better, because determining the 
>>> dependency graph in most cases is a one-time charge.
>>
>>
>> Oh, and a note on cyclic dependencies. I plan to do something along 
>> those lines when one of my applications needs it, unlike the many and 
>> varied applications I have developed over the past thirteen years. 
>> (Hint.)
>>
>> Looks easy to do something primitive, btw, but recall that Steele et 
>> al tried supporting all these elaborate forms of interdependence not 
>> because they needed them but because they had thought of them and came 
>> to grief in what I have christened The Bridge Too Far scenario.
>>
>> We have the concept of premature optimization, how about premature 
>> powerfication? Everything the constraints crowd tried (partial 
>> constraints, multi-way constraints) was a fun idea but hell to 
>> program, and look where it got them: Steele retreated to Scheme and 
>> then to Java and now to Fortran.
>>
>> I did a bit of the same early on with GUI geometry, offering ways for 
>> a widget to calculate a value if one was not specified. No right? Do 
>> we have left and width? That led to weeks of pain and suffering, then 
>> I realized I never needed it. In any particular case of GUI authoring 
>> I always knew how to calculate all the bounds. If I knew the width, I 
>> simply specified either the right as left plus width or the left as 
>> right minus width. If I knew neither left nor right I did not know 
>> what I was doing, and if I knew left, right, and width they better 
>> agree and then I can shut up about the width.
>>
>> I may as well touch on the last: collections? That is just 
>> engineering. Peter Hildebrandt is playing with something in re 
>> hashtables, and I have done so-called "aggregate" cells in the past. I 
>> guess object databases and RDF triple stores count as well. As I 
>> said, it is just a question of engineering -- one has to take the 
>> state one has in mind and create the glue that lets the Cells engine 
>> manage it.
> 
> 
> This already sounds a lot more balanced than "it's the best thing since 
> sliced bread and solves all problems for you." ;)

But balanced is the last thing one should hear from someone who has 
discovered something extraordinary. Case in point:

    http://wiki.alu.org:80/RtL_Highlight_Film

As PG said, when good programmers like something, don't argue, download.

Funny thing is that Cells is so great it even makes research into 
programming possible, naturally manifesting the complexity of an 
application as defined by Brooks in NSB as the number of kinds of state 
and their interdependence.

kt

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Pascal Costanza
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <692ck3F2uec38U1@mid.individual.net>
Ken Tilton wrote:
> 
> As PG said, when good programmers like something, don't argue, download.

Yes, I am doing that. :-D


Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <482c124c$0$15188$607ed4bc@cv.net>
Pascal Costanza wrote:
> Ken Tilton wrote:
> 
>>
>> As PG said, when good programmers like something, don't argue, download.
> 
> 
> Yes, I am doing that. :-D

You'll need an application. :)

kt

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Pascal Costanza
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <692ipvF2vm79oU1@mid.individual.net>
Ken Tilton wrote:
> 
> Pascal Costanza wrote:
>> Ken Tilton wrote:
>>
>>>
>>> As PG said, when good programmers like something, don't argue, download.
>>
>>
>> Yes, I am doing that. :-D
> 
> You'll need an application. :)

What a totally surprising and unexpected insight.


Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <482c54b9$0$11617$607ed4bc@cv.net>
Pascal Costanza wrote:
> Ken Tilton wrote:
> 
>>
>> Pascal Costanza wrote:
>>
>>> Ken Tilton wrote:
>>>
>>>>
>>>> As PG said, when good programmers like something, don't argue, 
>>>> download.
>>>
>>>
>>>
>>> Yes, I am doing that. :-D
>>
>>
>> You'll need an application. :)
> 
> 
> What a totally surprising and unexpected insight.

Actually, you have my compassion. [I'm trying, Edi. I'm really trying. 
<g>] How does one come up with a test application that is realistic, 
defined as not having been conceived just to test something?

I guess the easiest would be to build a GUI for something sensible and 
get a running start from Celtk or Cells-Gtk. If Peter H has made any 
headway, there might be a Cells-ODE out there, and you can do a pinball game.

RoboCells was a great test for Cells and indeed forced the evolution of 
Cells3. The bad news is that last I looked I had managed to lose the 
core layer. Quite a feat given that I have generations of development 
source trees all over every hard drive I own, in multiplicity.

People have told me Cells is kinda like KR. I have been tempted in the 
past to find out how much, and try building something LISA-like atop Cells.

The Grail of course nowadays is something like OpenAIR: Ajax with Cells 
Inside(tm). The Cells archive over the past month will reveal fanciful 
ruminations by moiself on how far that could be taken.

Just in case: the Fibonacci calculation was a joke. I said as much 
(twice), but someone complained afterwards that it seemed like a 
monstrous way to ... reminds me of the War of the Worlds panic; 
apparently they stated repeatedly during the show that it was fiction. 
Funny how that works.

kt

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Frank Buss
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <l80mew1g2aci.1j090zhta0hwj$.dlg@40tude.net>
Pascal Costanza wrote:

> Does it talk about efficiency concerns? Cyclic dependencies? 
> Dependencies between collections?

The real book has an index and bibliography, and there is a chapter called
"Performance Considerations". It also has chapters about loops, trees,
synchronization and checkpoints, and something called "related compiler
theory concepts", and of course many more chapters related to application
programming. I can say more in the next few weeks, when I'm finished with the
book.

-- 
Frank Buss, ··@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
From: Pascal Costanza
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <68u6hfF2uss7vU2@mid.individual.net>
Frank Buss wrote:
> Pascal Costanza wrote:
> 
>> Does it talk about efficiency concerns? Cyclic dependencies? 
>> Dependencies between collections?
> 
> The real book has an index and bibliography, and there is a chapter called
> "Performance Considerations". It also has chapters about loops, trees,
> synchronization and checkpoints, and something called "related compiler
> theory concepts", and of course many more chapters related to application
> programming. I can say more in the next few weeks, when I'm finished with the
> book.

OK, sounds interesting.


Pascal

-- 
1st European Lisp Symposium (ELS'08)
http://prog.vub.ac.be/~pcostanza/els08/

My website: http://p-cos.net
Common Lisp Document Repository: http://cdr.eurolisp.org
Closer to MOP & ContextL: http://common-lisp.net/project/closer/
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <48291770$0$15164$607ed4bc@cv.net>
Ken Tilton wrote:
> 
> 
> Frank Buss wrote:
> 
>> Ken Tilton wrote:
>>
>>
>>> You know, one of the things I noticed about Cells just recently is 
>>> what it does. Took long enough, eh? Anyway, I noticed that Cells is 
>>> about change. Cells give the programmer a hook into any change, and 
>>> more fun, Cells gives program state causal power over other causal 
>>> state, i.e., one thing can change another. So Cells is about the most 
>>> fundamental concept one can have in modelling, which is what 
>>> programmers do.
>>
>>
>>
>> Yes, dataflow concepts are very interesting. I've bought this book on 
>> dead
>> trees:
>>
>> http://www.jpaulmorrison.com/fbp/book.pdf
>>
>> Last week it arrived, now I'm reading it, when I go by train. I hope to
>> learn something more about the basic concepts. Cells is a pragmatic
>> implementation, but I would like to know more about the theory behind it.
> 
> 
> Paul Tarvydas just turned me on to that prior art, but I did not know it 
> was on-line. I have to draw myself a few fingers of single malt and have a 
> read.
> 
> My Google skills must be deficient, I never saw that one, and it is 
> about as close as one can get. Lessee, Morrison writes: "...a set of 
> concepts which, based on my experience over the last 30 years, I think 
> really does provide a quantum jump in improving application development 
> productivity."
> 
> What a nut job! Any #lisp yobbo can tell you Cells is a figment of my 
> imagination. A couple of them blogged their disappointment that I 
> focused on the tired old Cells story at ECLM. Meanwhile exactly one hand 
> in ninety-six went up when I asked how many of them used Cells, and that 
> person is a Cells committer!
> 
> Let's see what else my fellow delusionist has to say:
> 
> "While this technology has been in use for productive work for the last 
> 20 years, it has also been waiting in the wings, so to speak, for its 
> right time to come on stage. Perhaps because there is a "paradigm
> shift" involved, to use Kuhn's phrase (Kuhn 1970), it has not been 
> widely known up to now, but I believe now is the time to open it up to a 
> wider public."
> 
> Bad news, old sport, 2008 and the yobbos still don't get it. Apparently 
> they need /documentation/. Well, OpenLaszlo gets it, FRP gets it, 
> Flapjax and Adobe Adam get it...
> 
> "The time is now ripe for a new paradigm to replace the von Neumann 
> model as the bridging model between hardware and software."
> 
> 1994. Trying to popularize something in use for twenty years.

...and in 2002, /still/ trying to drum up interest in the Perl 
community: http://www.perlmonks.org/?node_id=148111

This really is funny. :)

kt

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Pertti Kellomäki
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <g0bfgl$1ho$1@news.cc.tut.fi>
Frank Buss wrote:
> This would be nice: Think of
> millions of interacting cells, where all cells are updated in one clock
> cycle in parallel. This would be very similar to the human brain.

If you want to go that way, forget the clock. Asynchronous hardware
is much more interesting.
-- 
Pertti
From: Ken Tilton
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <4829b30e$0$11615$607ed4bc@cv.net>
Pertti Kellomäki wrote:
> Frank Buss wrote:
> 
>> This would be nice: Think of
>> millions of interacting cells, where all cells are updated in one clock
>> cycle in parallel. This would be very similar to the human brain.
> 
> 
> If you want to go that way, forget the clock. Asynchronous hardware
> is much more interesting.

C'mon, guys! I went to a lot of trouble in Cells-3 to make sure nothing 
got calculated out of order! Now we have everything running 
helter-skelter any time they feel like it!

That actually went OK for quite a while -- things eventually converged 
on a correct value... maybe the trick is to let that be the default and 
then in the odd case where latency might cause an unintended missile 
strike on the yobbos we would hesitate but then do the right thing and 
enforce some synchronicity lock....

kxo

-- 
http://smuglispweeny.blogspot.com/
http://www.theoryyalgebra.com/
ECLM rant: 
http://video.google.com/videoplay?docid=-1331906677993764413&hl=en
ECLM talk: 
http://video.google.com/videoplay?docid=-9173722505157942928&q=&hl=en
From: Petter Gustad
Subject: Re: Lisp on an FPGA
Date: 
Message-ID: <87d4nocce6.fsf@pangea.home.gustad.com>
Jürgen Böhm <······@gmx.net> writes:

> Under the working title "LispmFPGA" I recently completed a first
> milestone on the way to an autonomous, "bare metal" Lisp on a CPU
> specifically designed to execute compiled Lisp code.

Great job! Looks like lots of fun. 

> - a Lisp adapted, stack oriented, microprogrammed 32 bit-CPU

I noticed that you had a parser for your microcode assembler. Why not
use Common Lisp? I once wrote a microcode assembler generator. It
would extract the specification from the Verilog HDL (defines to
specify opcodes and bit fields within instructions). It would then
generate the micro-assembler on the fly and run it. The microcode
assembly then used Lisp syntax and took advantage of all the
features of Common Lisp. It was pretty short too, only a couple
hundred lines long.
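
Roughly the flavor of it, as a made-up sketch (not the real code -- the
opcode field layout and the plain-integer `define values are just
assumptions for illustration):

  (defparameter *opcodes* (make-hash-table :test #'equal))

  (defun words (line)
    "Split LINE on runs of spaces and tabs."
    (let ((out '()) (start nil))
      (dotimes (i (length line))
        (if (member (char line i) '(#\Space #\Tab))
            (when start
              (push (subseq line start i) out)
              (setf start nil))
            (unless start (setf start i))))
      (when start (push (subseq line start) out))
      (nreverse out)))

  (defun load-opcodes (verilog-file)
    "Collect `define NAME VALUE lines into *OPCODES*."
    (with-open-file (in verilog-file)
      (loop for line = (read-line in nil)
            while line
            do (let ((w (words line)))
                 (when (and (>= (length w) 3)
                            (string= (first w) "`define"))
                   (setf (gethash (second w) *opcodes*)
                         (parse-integer (third w) :junk-allowed t)))))))

  (defun make-assembler ()
    "Define one Lisp function per opcode; here each just packs a 32-bit word."
    (maphash (lambda (name value)
               (setf (symbol-function (intern (string-upcase name)))
                     (lambda (&optional (operand 0))
                       (logior (ash value 24) operand))))
             *opcodes*))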

> All this is implemented with Verilog HDL on a Xilinx Spartan 3 FPGA. The

I think it would be great fun to design an HDL based upon Common Lisp
and write a simulator and synthesis tool (which would transform the
CL-HDL into EDIF) for it, presumably a parallel version to cut down
the synthesis time for next-generation FPGAs (i.e. 1M LEs).
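
Purely as a doodle of what such a CL-HDL could feel like (a toy, nowhere
near a simulator or an EDIF writer): modules as s-expressions, and
"synthesis" as a tree walk that just prints a flat netlist.

  (defun emit-netlist (module)
    "MODULE is (name (port...) instance...); print one line per instance."
    (destructuring-bind (name ports &rest instances) module
      (format t "module ~a, ports ~a~%" name ports)
      (dolist (inst instances)
        (destructuring-bind (gate &key inputs output) inst
          (format t "  ~a: ~a -> ~a~%" gate inputs output)))))

  (emit-netlist
   '(half-adder (a b sum carry)
     (xor2 :inputs (a b) :output sum)
     (and2 :inputs (a b) :output carry)))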

Good luck on your project!

Petter
-- 
A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?