From: I R T
Subject: Lisp being used for microprocessor development
Date: 
Message-ID: <acz6wd38.fsf@pop-server.bigpond.net.au>
Andy Glew's CV says that he is actively using C, C#, and Lisp.
Since he is (or was, until he resigned from AMD) involved with
designing the next generation of microprocessors, it looks like
Intel (he worked there for years and was the architect of the P6)
and AMD use Lisp for chip design.

Lisp Inside! should be the logo on processors :-)

From: OCID
Subject: Re: Lisp being used for microprocessor development
Date: 
Message-ID: <4414486f.0406140930.7385e0@posting.google.com>
I R T <······@bigpond.net.au> wrote in message news:<············@pop-server.bigpond.net.au>...
> Andy Glew's CV says that he is actively using C, C#, and Lisp.
> Since he is (or was, until he resigned from AMD) involved with
> designing the next generation of microprocessors, it looks like
> Intel (he worked there for years and was the architect of the P6)
> and AMD use Lisp for chip design.
> 
> Lisp Inside! should be the logo on processors :-)

It is not used in the processors themselves; it is used for chip
design validation. I think Intel uses ML, AMD uses CL, IBM uses
Scheme, and Motorola uses something similar. I think two professors
from Texas started the whole Lisp/Scheme thing with processor
validation after Intel's famous bug. This is something I have heard
from reasonably reliable people but cannot confirm.
From: Thomas Lindgren
Subject: Re: Lisp being used for microprocessor development
Date: 
Message-ID: <m3fz8x7t6j.fsf@localhost.localdomain>
···········@yahoo.com (OCID) writes:

> I think two professors from Texas
> started the whole Lisp/Scheme thing with processor validation after
> Intel's famous bug. This is something I have heard from reasonably
> reliable people but cannot confirm.

You're probably thinking of Boyer and Moore, though their company
(Computational Logic, Inc) seems to have closed.

 "In 1996, CLI's second-generation theorem prover - a sophisticated
 software "assistant" that mechanically checks whether a large-scale
 hardware and software system will actually perform according to its
 specification - helped Advanced Micro Devices verify the
 implementation of floating point division on the AMD K5 Pentium-class
 processor."

<http://www.computationallogic.com/corp/cli-background.html>

Best,
                        Thomas
-- 
Thomas Lindgren
"It's becoming popular? It must be in decline." -- Isaiah Berlin
 
From: Engelke Eschner
Subject: Re: Lisp being used for microprocessor development
Date: 
Message-ID: <cakvm9$i9r$05$1@news.t-online.com>
OCID wrote:
> I R T <······@bigpond.net.au> wrote in message news:<············@pop-server.bigpond.net.au>...
> 
>>Andy Glew's CV says that he is actively using C, C#, and Lisp.
>>Since he is (or was, until he resigned from AMD) involved with
>>designing the next generation of microprocessors, it looks like
>>Intel (he worked there for years and was the architect of the P6)
>>and AMD use Lisp for chip design.
>>
>>Lisp Inside! should be the logo on processors :-)
> 
> 
> It is not used in the processors themselves; it is used for chip
> design validation. I think Intel uses ML, AMD uses CL, IBM uses
> Scheme, and Motorola uses something similar. I think two professors
> from Texas started the whole Lisp/Scheme thing with processor
> validation after Intel's famous bug. This is something I have heard
> from reasonably reliable people but cannot confirm.

  Just google for AMD and ACL2; you will find a multitude of pages.
From: Petter Gustad
Subject: Re: Lisp being used for microprocessor development
Date: 
Message-ID: <m3isdtgta6.fsf@scimul.dolphinics.no>
···········@yahoo.com (OCID) writes:

> I R T <······@bigpond.net.au> wrote in message news:<············@pop-server.bigpond.net.au>...
> > Andy Glew's CV says that he is actively using C, C#, and Lisp.
> > Since he is (or was, until he resigned from AMD) involved with
> > designing the next generation of microprocessors, it looks like
> > Intel (he worked there for years and was the architect of the P6)
> > and AMD use Lisp for chip design.
> > 
> > Lisp Inside! should be the logo on processors :-)
> 
> It is not used in the processors themselves; it is used for chip design
> validation. I think Intel uses ML, AMD uses CL, IBM uses Scheme and

I guess Lisp is used indirectly in the design process, since some
common EDA software from Cadence (if memory serves me right) was
written in Lisp. If you look at some of the EDA file formats, like
EDIF (for netlists) and SDF (delay information from chip routers),
they are indeed s-expressions and were most likely influenced by Lisp
programmers.
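To make the s-expression point concrete: a few lines of code are enough
to read EDIF/SDF-style data into nested lists. This is only a sketch;
the "netlist" below is a made-up, EDIF-flavored fragment, not real EDIF
(which has far more structure), and the reader handles no strings or
comments.

```python
# Minimal s-expression reader, enough to walk EDIF/SDF-style data.
# The netlist text is an invented, EDIF-flavored fragment for
# illustration only.

def tokenize(text):
    """Split an s-expression string into parens and atoms."""
    return text.replace("(", " ( ").replace(")", " ) ").split()

def parse(tokens):
    """Consume tokens and return nested lists of atoms."""
    tok = tokens.pop(0)
    if tok == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(parse(tokens))
        tokens.pop(0)  # drop the closing ")"
        return lst
    return tok

netlist = "(cell inverter (view netlist (interface (port in) (port out))))"
tree = parse(tokenize(netlist))

print(tree[0], tree[1])                 # cell inverter
ports = [p[1] for p in tree[2][2][1:]]  # names under (interface ...)
print(ports)                            # ['in', 'out']
```

The same recursive-descent shape is what a Lisp reader does natively,
which is presumably why these formats felt natural to their designers.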

Petter

-- 
A: Because it messes up the order in which people normally read text.
Q: Why is top-posting such a bad thing?
A: Top-posting.
Q: What is the most annoying thing on usenet and in e-mail?
From: Andreas Hinze
Subject: Re: Lisp being used for microprocessor development
Date: 
Message-ID: <40cf31db$0$258$4d4ebb8e@read.news.de.uu.net>
Petter Gustad wrote:

> 
> I guess Lisp is used indirectly in the design process since some
> common EDA software from Cadence (if memory serves me right) was
> written in Lisp. 
Cadence developed SKILL as the command/script language for their
tools. In fact SKILL was built on top of Lisp. There is an ACM paper
that reports on this:

  "SKILL: A CAD system extension language"
  Timothy J. Barnes
  Proceedings of the 27th ACM/IEEE Conference on Design Automation,
  January 1991

HTH
Regards
AHz
From: William D Clinger
Subject: Re: Lisp being used for microprocessor development
Date: 
Message-ID: <fb74251e.0406141149.678340d1@posting.google.com>
I R T <······@bigpond.net.au> wrote:
> Andy Glew's CV says that he is actively using C, C#, and Lisp.
> Since he is (or was, until he resigned from AMD) involved with
> designing the next generation of microprocessors, it looks like
> Intel (he worked there for years and was the architect of the P6)
> and AMD use Lisp for chip design.

Probably ACL2, a purely functional subset of Common Lisp that AMD and
now Intel have been using to construct formal proofs of correctness
for major subsystems of their microprocessors.

They've been doing this for years now, starting right after the
Pentium's FDIV bug threw a scare into the AMD designers.  Although
the formal proofs are very tedious and require PhD-level math talent,
they don't require as much project time as simulator-based testing
used to, and they have spotted bugs that probably would have slipped
through simulator- and execution-based testing.

Will