From: Didier Verna
Subject: Writing a specialized web client in Common Lisp
Date: 
Message-ID: <mux1wg881ot.fsf@uzeb.lrde.epita.fr>
       Hello,

I'd like to write an alternative web client for a particular site (in
order to avoid using a web browser). So I went to
http://www.cliki.net/web%20client

Any suggestion on the best choice (including one that would not be
mentioned there)?

So far, I naturally started by looking at cl-curl, but it looks
outdated, and I tried Edi's Drakma, but I already have problems with
unparsable cookie expiry formats...
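
For reference, the kind of thing I'm doing with Drakma is roughly this
(untested sketch; the URL and form parameters are placeholders, not the
actual site):

```lisp
;; Sketch only: fetch a page with Drakma while keeping cookies across
;; requests, so a login session carries over. Assumes Drakma is loaded.
(defparameter *jar* (make-instance 'drakma:cookie-jar))

(multiple-value-bind (body status)
    (drakma:http-request "http://www.example.com/login"
                         :method :post
                         :parameters '(("user" . "didier"))
                         :cookie-jar *jar*)
  (when (= status 200)
    body)) ; later requests with :cookie-jar *jar* reuse the session
```

It is while filling *jar* from the site's Set-Cookie headers that the
expiry-date parsing blows up.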

trivial-http will probably be too... trivial, and closure is a web
browser, so it might be difficult to extract just the part that I need.
What about s-http-client and the client part of cl-http?

Any hint appreciated!

-- 
Didier Verna, ······@lrde.epita.fr, http://www.lrde.epita.fr/~didier

EPITA / LRDE, 14-16 rue Voltaire   Tel.+33 (1) 44 08 01 85
94276 Le Kremlin-Bicêtre, France   Fax.+33 (1) 53 14 59 22   ······@xemacs.org

From: Edi Weitz
Subject: Re: Writing a specialized web client in Common Lisp
Date: 
Message-ID: <uy7igrpdt.fsf@agharta.de>
On Tue, 19 Jun 2007 16:21:38 +0200, Didier Verna <······@lrde.epita.fr> wrote:

> So far, I naturally started by looking at cl-curl, but it looks
> outdated

Why is it "natural" for a Lisp hacker to look for a C library first?
I don't get it.

> and I tried Edi's Drakma, but I already have problem with unparsable
> cookies expiry formats...

How about sending a bug report?

Edi.

-- 

Lisp is not dead, it just smells funny.

Real email: (replace (subseq ·········@agharta.de" 5) "edi")
From: Didier Verna
Subject: Re: Writing a specialized web client in Common Lisp
Date: 
Message-ID: <muxwsy06m3f.fsf@uzeb.lrde.epita.fr>
Edi Weitz <········@agharta.de> wrote:

> Why is it "natural" for a Lisp hacker to look for a C library first?
> I don't get it.

  Nothing to do with being a Lisp hacker. I simply knew about curl in
the first place.


>> and I tried Edi's Drakma, but I already have problem with unparsable
>> cookies expiry formats...
>
> How about sending a bug report?

  Will do, don't worry.

-- 
Didier Verna, ······@lrde.epita.fr, http://www.lrde.epita.fr/~didier

EPITA / LRDE, 14-16 rue Voltaire   Tel.+33 (1) 44 08 01 85
94276 Le Kremlin-Bicêtre, France   Fax.+33 (1) 53 14 59 22   ······@xemacs.org
From: Larry Clapp
Subject: Re: Writing a specialized web client in Common Lisp
Date: 
Message-ID: <slrnf7g5sr.ou5.larry@theclapp.ddts.net>
On 2007-06-19, Didier Verna <······@lrde.epita.fr> wrote:
> I'd like to write an alternative web client for a particular site
> (in order to avoid using a web browser). So I went to
> http://www.cliki.net/web%20client
>
> Any suggestion on the best choice (including one that would not be
> mentioned there)?

http://www.cliki.net/CL-HTML-Parse might be of some use.  It compiled
out of the box for Lispworks Pro 5.0.2.
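
If it's anything like the AllegroServe parser it's derived from, usage
would look something like this (untested sketch; package name assumed):

```lisp
;; Sketch: parse an HTML string into a nested s-expression (LHTML-style)
;; tree with CL-HTML-Parse, which you can then walk with ordinary list
;; operations to pull out the pieces you want.
(net.html.parser:parse-html
 "<html><body><p>Hello</p></body></html>")
```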

-- L
From: Tim X
Subject: Re: Writing a specialized web client in Common Lisp
Date: 
Message-ID: <87odjbrpp1.fsf@lion.rapttech.com.au>
Larry Clapp <·····@theclapp.org> writes:

> On 2007-06-19, Didier Verna <······@lrde.epita.fr> wrote:
>> I'd like to write an alternative web client for a particular site
>> (in order to avoid using a web browser). So I went to
>> http://www.cliki.net/web%20client
>>
>> Any suggestion on the best choice (including one that would not be
>> mentioned there)?
>
> http://www.cliki.net/CL-HTML-Parse might be of some use.  It compiled
> out of the box for Lispworks Pro 5.0.2.
>
> -- L
>

I used the aserve client functionality to implement my own podcast aggregator.
It was extremely easy and worked very nicely (though I gave up trying to parse
the RSS XML, not because that was hard, but because too many RSS feeds seem to
be broken. Instead, I cheated and used cl-ppcre to grab the enclosure
information). The whole thing was only a couple of days' work and, from
memory, less than 100 lines of code. I only stopped using it because I needed
something with bittorrent support. While I had some issues parsing the RSS
XML, this was more about the XML being badly formed than any weakness in CL or
the available libraries.
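
The core of it was something along these lines (untested sketch from
memory; the function name and regex are illustrative, not my actual
code):

```lisp
;; Sketch: fetch an RSS feed with the AllegroServe client and pull out
;; enclosure URLs with a regex instead of a full XML parse, which is
;; more forgiving of malformed feeds.
(defun enclosure-urls (feed-url)
  (let ((body (net.aserve.client:do-http-request feed-url))
        (urls '()))
    (cl-ppcre:do-register-groups (url)
        ("<enclosure[^>]*url=\"([^\"]+)\"" body)
      (push url urls))
    (nreverse urls)))
```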

So, if all you need is a simple client that can retrieve content from a
specific site, aserve is worth looking at. If you combined that with a basic
CL HTML parser, you could probably also display the content fairly easily (as
long as it doesn't have javascript or a complex structure, etc.).

Tim

-- 
tcross (at) rapttech dot com dot au