dslinux/user/wget/util Makefile.in README dist-wget download-netscape.html download.html rmold.pl wget.spec

amadeus dslinux_amadeus at user.in-berlin.de
Thu Aug 31 11:32:46 CEST 2006


Update of /cvsroot/dslinux/dslinux/user/wget/util
In directory antilope:/tmp/cvs-serv14346/user/wget/util

Added Files:
	Makefile.in README dist-wget download-netscape.html 
	download.html rmold.pl wget.spec 
Log Message:
Add some more applications

--- NEW FILE: download.html ---
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML//EN">
<html>
  <head>
    <title>Wget Gateway</title>
    <link rev="made" href="mailto:Antonio.Rosella at agip.it">
  </head>

  <body>
      <h1>Wget Gateway</h1>
      <p>
	Welcome to Wget Gateway, a simple page showing the usage of
	socksified wget behind a firewall. In my configuration it is
	very useful because:
      <ul>
	<li>Only a few users can get out through the firewall
	<li>Many users need information that is available on the Internet
	<li>I cannot download big files during working hours, so I
	  have to schedule the requests for after normal work time
      </ul>

      <p>
	With the combination of a socksified wget and a simple CGI
	script that schedules the requests, I can achieve this. All
	you need is:
      <ul>
	<li> A socksified copy of
	  <a href="ftp://gnjilux.cc.fer.hr/pub/unix/util/wget/wget.tar.gz">
	    wget</a>
	<li> Perl (available on all the GNU mirroring sites)
	<li> cgi-lib.pl (available at
	  <a href="ftp://ftp.switch.ch/mirror/CPAN/ROADMAP.html">CPAN</a>)
	<li> A customized copy of this HTML page
	<li> A customized copy of wget.cgi
      </ul>
      This is my hardware/software configuration:
      <pre>

+----------+ +----------------------------------+ +---------------------+
| Firewall | | Host that can exit from firewall | | Intranet www server |
+----------+ |            htceff                | +---------------------+
             +----------------------------------+ | Wget.html           |
             | socksified wget                  | +---------------------+
	     | cgi-lib.pl                       | 
	     | perl                             |
	     | wget.cgi                         |
	     +----------------------------------+
      </pre>
      <p>
	wget.cgi, wget and cgi-lib.pl are located in the usual
	cgi-bin directory. The customization of wget.cgi and
	download.html has to reflect your installation, i.e.:
      <ul>
	<li> download.html requires wget.cgi
	<li> wget.cgi requires Perl, cgi-lib.pl and wget
	<li>
	  wget.cgi has to download the files to a directory writable
	  by the user submitting the request.  At the moment I have
	  anonymous ftp installed on <em>htceff</em>, and wget puts
	  downloaded files into the /pub/incoming directory (if you
	  look at wget.cgi, it sets destdir to "/u/ftp/pub/incoming"
	  if the user leaves it blank).
      </ul>
      <p>
	You can also pass other parameters to wget, but in that case
	you will also have to modify wget.cgi accordingly.

      <hr>
      <form method="get" action="http://localhost/cgi-bin/wget.cgi">
	<h3>Downloading (optionally recursive)</h3>
	<ul>
	  <li>
	    Recursion:
	    <Select name=Recursion>
	      <Option selected value=N>No</Option>
	      <Option value=Y>Yes</Option>
	    </Select>
	  <li>
	    Depth:
	    <input type="radio" name=depth value=1 checked>1
	    <input type="radio" name=depth value=2 >2
	    <input type="radio" name=depth value=3 >3 
	    <input type="radio" name=depth value=4 >4
	    <input type="radio" name=depth value=5 >5
	  <li>
	    Url to download: <input name="url" size=50>
	  <li>
	    Destination directory: <input name="destdir" size=50>
	</ul>
	Now you can <input type="submit" value="download"> the
	requested URL or <input type="reset" value="reset"> the form.
      </form>
      <hr>
      Feedback is always useful! Please contact me at
      <address>
	<a href="mailto:Antonio.Rosella at agip.it">Antonio Rosella&lt;Antonio.Rosella at agip.it&gt;</a>.
      </address>
      You can send your suggestions or bug reports for Wget to
      <address>
	<a href="mailto:hniksic at arsdigita.com">Hrvoje Niksic &lt;hniksic at arsdigita.com&gt;</a>.
      </address>
      <!-- hhmts start -->
Last modified: October 23, 2000
<!-- hhmts end -->
  </body>
</html>


--- NEW FILE: README ---
                                                           -*- text -*-

This directory contains various optional utilities to help you use
Wget.


Socks:
======
Antonio Rosella <antonio.rosella at agip.it> has written a sample HTML
frontend and a Perl script to demonstrate the use of socksified Wget
as a web retriever.

To configure Wget to use SOCKS, run:
$ ./configure --with-socks

download.html and download-netscape.html are examples of how you can
use socksified Wget to schedule WWW requests.  wget.cgi is a CGI Perl
script used in conjunction with download.html, which schedules
requests using the "at" command.
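To make the scheduling idea concrete, here is a minimal shell sketch of what such a CGI ends up doing. The URL, destination directory, and the 17:30 queue time are illustrative assumptions, not taken from wget.cgi itself:

```shell
# Build the wget command line a scheduling CGI might queue.
# URL and DESTDIR stand in for the CGI form fields; the default
# destination directory mentioned in download.html is used here.
URL='http://www.gnu.org/'
DESTDIR='/u/ftp/pub/incoming'
CMD="wget -P $DESTDIR '$URL'"
echo "$CMD"
# To actually defer the download past working hours, one would
# hand the command to at(1), e.g.:  echo "$CMD" | at 17:30
```

The real wget.cgi also handles the recursion and depth form fields; this only shows the queueing mechanism.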

To get the script, contact Antonio.

rmold.pl
========
This Perl script checks which local files are no longer present on
the remote server.  You can use it to get a list of such files, or
remove them outright with:
$ rmold.pl [dir] | xargs rm
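The comparison rmold.pl performs can be illustrated with plain shell
tools (the file names below are made up for the example):

```shell
# Files present locally but missing from the server's listing are
# the removal candidates rmold.pl would print.
printf '%s\n' index.html old-page.html | sort > local.txt
printf '%s\n' index.html new-page.html | sort > remote.txt
comm -23 local.txt remote.txt    # lines only in local.txt
rm -f local.txt remote.txt
```

Reviewing that list before piping it to "rm" is strongly recommended.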


--- NEW FILE: Makefile.in ---
# Makefile for `wget' utility
# Copyright (C) 1995, 1996, 1997 Free Software Foundation, Inc.

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.

# In addition, as a special exception, the Free Software Foundation
# gives permission to link the code of its release of Wget with the
# OpenSSL project's "OpenSSL" library (or with modified versions of it
# that use the same license as the "OpenSSL" library), and distribute
# the linked executables.  You must obey the GNU General Public License
# in all respects for all of the code used other than "OpenSSL".  If you
# modify this file, you may extend this exception to your version of the
# file, but you are not obligated to do so.  If you do not wish to do
# so, delete this exception statement from your version.

#
# Version: @VERSION@
#

SHELL = /bin/sh

top_builddir = ..

srcdir = @srcdir@
VPATH  = @srcdir@

RM = rm -f

all:

clean:

distclean: clean
	$(RM) Makefile

realclean: distclean


--- NEW FILE: rmold.pl ---
#! /usr/bin/perl -w

# Copyright (C) 1995, 1996, 1997 Free Software Foundation, Inc.

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.


# This script is a very lame hack to remove local files, until the
# time when Wget proper will have this functionality.
# Use with utmost care!

# If the remote server supports BSD-style listings, set this to 0.
$sysvlisting = 1;

$verbose = 0;

if (@ARGV && ($ARGV[0] eq '-v')) {
    shift;
    $verbose = 1;
}

(@dirs = @ARGV) || push (@dirs,'.');


foreach $_ (@dirs) {
    &procdir($_);
}

# End here

sub procdir
{
    local $dir = shift;
    local(@lcfiles, @lcdirs, %files, @fl);

    print STDERR "Processing directory '$dir':\n" if $verbose;
    
    opendir(DH, $dir) || die("Cannot open $dir: $!\n");
    @lcfiles = ();
    @lcdirs = ();
    # Read local files and directories.
    foreach $_ (readdir(DH)) {
        /^(\.listing|\.\.?)$/ && next;
        lstat ("$dir/$_");
        if (-d _) {
            push (@lcdirs, $_);
        }
        else {
            push (@lcfiles, $_);
        }
    }
    closedir(DH);
    # Parse .listing
    if (open(FD, "<$dir/.listing")) {
        while (<FD>)
        {
            # Weed out the line beginning with 'total'
            /^total/ && next;
            # Weed out everything but plain files and symlinks.
            /^[-l]/ || next;
            @fl = split;
            $files{$fl[7 + $sysvlisting]} = 1;
        }
        close FD;
        foreach $_ (@lcfiles) {
            if (!$files{$_}) {
                print "$dir/$_\n";
            }
        }
    }
    else {
        print STDERR "Warning: $dir/.listing: $!\n";
    }
    foreach $_ (@lcdirs) {
        &procdir("$dir/$_");
    }
}
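A note on the `7 + $sysvlisting` index above: the script assumes a
BSD-style listing puts the file name in the 8th whitespace-separated
field (index 7), while a System V style listing carries one extra
column and pushes the name to the 9th field (index 8). A quick shell
illustration with a made-up System V style line:

```shell
# A System V style listing line: perms, links, owner, group, size,
# month, day, time, name -- nine fields, name last.
line='-rw-r--r--   1 ftp      ftp      1024 Aug 31 11:32 wget.spec'
set -- $line
echo "$9"    # with $sysvlisting = 1, rmold.pl keys on this field
```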


--- NEW FILE: download-netscape.html ---
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<html>
  <head>
    <title>Wget Gateway</title>
    <link rev="made" href="mailto:Antonio.Rosella at agip.it">
  </head>
  
  <body>
    <center>
      <h1>Wget Gateway</h1>
    </center>
    <p>
      Welcome to Wget Gateway, a simple page showing the usage of
      socksified wget behind a firewall. In my configuration it is
      very useful because:
    <ul>
      <li>Only a few users can get out through the firewall
      <li>Many users need information that is available on the Internet
      <li>I cannot download big files during working hours, so I
	have to schedule the requests for after normal work time
    </ul>

    <p>
      With the combination of a socksified wget and a simple CGI
      script that schedules the requests, I can achieve this. All
      you need is:
    <ul>
      <li> A socksified copy of
	<a href="ftp://gnjilux.cc.fer.hr/pub/unix/util/wget/wget.tar.gz">
	  wget</a>
      <li> Perl (available on all the GNU mirroring sites)
      <li> cgi-lib.pl (available at
	<a href="ftp://ftp.switch.ch/mirror/CPAN/ROADMAP.html">CPAN</a>)
      <li> A customized copy of this HTML page
      <li> A customized copy of wget.cgi
    </ul>
    This is my hardware/software configuration:
    <pre>

+----------+ +----------------------------------+ +---------------------+
| Firewall | | Host that can exit from firewall | | Intranet www server |
+----------+ |            htceff                | +---------------------+
             +----------------------------------+ | Wget.html           |
             | socksified wget                  | +---------------------+
	     | cgi-lib.pl                       | 
	     | perl                             |
	     | wget.cgi                         |
	     +----------------------------------+
    </pre>
    <p>
      wget.cgi, wget and cgi-lib.pl are located in the usual
      cgi-bin directory. The customization of wget.cgi and
      download-netscape.html has to reflect your installation, i.e.:
    <ul>
      <li> download-netscape.html requires wget.cgi
      <li> wget.cgi requires Perl, cgi-lib.pl and wget
      <li>
	wget.cgi has to download the files to a directory writable
	by the user submitting the request.  At the moment I have
	anonymous ftp installed on <em>htceff</em>, and wget puts
	downloaded files into the /pub/incoming directory (if you
	look at wget.cgi, it sets destdir to "/u/ftp/pub/incoming"
	if the user leaves it blank).
    </ul>
    <p>
      You can also pass other parameters to wget, but in that case
      you will also have to modify wget.cgi accordingly.

    <hr>
    <form method="get" action="http://localhost/cgi-bin/wget.cgi">
      <center>
	<table border=1>
	    <td>Recursive Download
	    <td><select name=Recursion>
		<Option selected value=N>No</Option>
		<Option value=Y>Yes</Option>
	      </select>
	</table>
	<hr>
	<table border=1>
	    <td>Depth
	    <td><input type="radio" name=depth value=1 checked> 1
	    <td><input type="radio" name=depth value=2 > 2
	    <td><input type="radio" name=depth value=3 > 3 
	    <td><input type="radio" name=depth value=4 > 4
	    <td><input type="radio" name=depth value=5 > 5
	</table>
	<hr>
	<table>
	    <td>Url to download: <td><input name="url" size=50><TR>
	    <td>Destination directory: <td><input name="destdir" size=50><TR>
	</table>
	<hr>
	Now you can
	<font color=yellow><input type="submit"	value="download"></font>
	the requested URL or
	<font color=yellow><input type="reset" value="reset"></font>
	the form.
    </form>
    <hr>
    Feedback is always useful! Please contact me at
    <address>
      <a href="mailto:Antonio.Rosella at agip.it">Antonio Rosella&lt;Antonio.Rosella at agip.it&gt;</a>.
    </address>
    You can send your suggestions or bug reports for Wget to
    <address>
      <a href="mailto:hniksic at arsdigita.com">Hrvoje Niksic &lt;hniksic at arsdigita.com&gt;</a>.
    </address>
    <!-- hhmts start -->
Last modified: Mon Oct 23 17:40:03 CEST 2000
<!-- hhmts end -->
  </body>
</html>


--- NEW FILE: dist-wget ---
#!/bin/sh

# Copyright (C) 2001 Free Software Foundation, Inc.

# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 2 of the License, or
# (at your option) any later version.

# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
# GNU General Public License for more details.

# You should have received a copy of the GNU General Public License
# along with this program; if not, write to the Free Software
# Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.

# In addition, as a special exception, the Free Software Foundation
# gives permission to link the code of its release of Wget with the
# OpenSSL project's "OpenSSL" library (or with modified versions of it
# that use the same license as the "OpenSSL" library), and distribute
# the linked executables.  You must obey the GNU General Public License
# in all respects for all of the code used other than "OpenSSL".  If you
# modify this file, you may extend this exception to your version of the
# file, but you are not obligated to do so.  If you do not wish to do
# so, delete this exception statement from your version.

##
#
# This script creates a Wget distribution (wget-VERSION.tar.gz).
# It uses `make dist' to do most of the work, but corrects some
# things that `make dist' doesn't and can't do.  Specifically:
#
# * Checks out the clean CVS source from the repository to a temporary
#   directory.
# * Runs autoconf, configure and `make' in the doc and po subdirs to
#   make sure that all the generated files, such as `configure',
#   `wget.info', and translated PO files, end up in the distribution.
# * Optionally changes src/version.c and doc/version.texi to the
#   version forced by `--force-version'.
# * Runs `make dist' to produce the archive.
# * Removes the checkout.
#
# For example, to produce a Wget beta based on the latest CVS sources,
# with version "1.23-beta10", run `dist-wget --force-version 1.23-beta10'.
# You can choose which sources will be used by specifying `-D DATE'
# or `-r TAG'.
#
##

set -e

CVSROOT=:pserver:cvs at sunsite.dk:/pack/anoncvs
SUBDIR=wget.cvs.$$
DEBUG=no

EXPORT_TAG='-r HEAD'
VERSION=
MAKE=${MAKE-make}

if test x"$TMPDIR" = x
then
  TMPDIR=/tmp
fi
DEST_DIR=`pwd`

while test x"$*" != x
do
  case "$1" in
    -d)
      DEBUG=yes
      ;;
    -D)
      shift
      EXPORT_TAG="-D $1"
      ;;
    -r)
      shift
      EXPORT_TAG="-r $1"
      ;;
    --force-version)
      shift
      VERSION=$1
      ;;
    *)
      echo "Usage: $0 [-d] [-r TAG | -D DATE]" >&2
      exit 1
  esac
  shift
done

# Resolve echo -n incompatibilities.
e_n=-n
e_c=
if test x"`(echo -n foo; echo bar)`" != xfoobar; then
  e_n=
  e_c='\c'
fi

# File for output/errors redirection.
O=$DEST_DIR/dist-output

cd $TMPDIR

echo "Building wget dist in $TMPDIR/$SUBDIR."
echo "Output from commands is in $O."

echo "-----------" >$O

# Checkout clean sources from the repository.
echo $e_n "Exporting ($EXPORT_TAG) out the CVS tree to $TMPDIR/$SUBDIR... $e_c"
cvs -d $CVSROOT export $EXPORT_TAG -d $SUBDIR wget 1>>$O 2>&1
echo "done."

cd $SUBDIR

# Remove the dummy `Branches' directory.
rm -rf Branches 1>>$O 2>&1

# Force the version if required.
if test x"$VERSION" != x
then
  echo "Forcing version to $VERSION."
  echo "char *version_string = \"$VERSION\";" > src/version.c
  echo "@set VERSION $VERSION" > doc/version.texi
fi

# Create configure and friends.
if test ! -f configure; then
  echo $e_n "Creating \`configure' from \`configure.in'... $e_c"
  $MAKE -f Makefile.cvs 1>>$O 2>&1
  echo "done."
fi

# Remove `Makefile' if it already exists.
if test -f Makefile; then
  echo $e_n "Cleaning old Makefiles with \`$MAKE distclean'... $e_c"
  $MAKE distclean 1>>$O 2>&1
  echo "done."
fi

# Create a new `Makefile'.
echo $e_n "Running configure... $e_c"
CFLAGS=-g ./configure 1>>$O 2>&1
echo "done."

# Now build the MO files.
echo $e_n "Building MO files out of PO files... $e_c"
cd po
$MAKE 1>>$O 2>&1
cd ..
echo "done."

# Now build the Info documentation and the man page.
echo $e_n "Building Info and man documentation... $e_c"
cd doc
$MAKE 1>>$O 2>&1
cd ..
echo "done."

# Create the distribution file.
echo $e_n "Creating distribution tarball... $e_c"
$MAKE dist 1>>$O 2>&1
archive=`echo wget-*.tar.gz`
mv "$archive" $DEST_DIR
echo "$archive"

cd ..

if test $DEBUG = no; then
  rm -rf $SUBDIR 1>>$O 2>&1
fi

--- NEW FILE: wget.spec ---
Name: wget
Version: 1.7
Release: 1
Copyright: GPL
Source: ftp://ftp.gnu.org/gnu/wget/wget-%{version}.tar.gz 
Url: http://sunsite.dk/wget/
Provides: webclient
Prereq: /sbin/install-info
BuildRoot: /var/tmp/%{name}-root

Group: Applications/Internet
Group(cs): Aplikace/Internet
Summary: A utility for retrieving files using the HTTP or FTP protocols.
Summary(cs): Nástroj pro stahování souborů pomocí protokolů HTTP nebo FTP.

%description
GNU Wget is a free network utility to retrieve files from the World
Wide Web using the HTTP and FTP protocols. It works non-interactively,
so it can keep running in the background after you have logged off.
Wget supports recursive retrieval of HTML pages, as well as FTP sites.
Wget supports proxy servers, which can lighten the network load, speed
up retrieval and provide access behind firewalls.

It also works exceedingly well on slow or unstable connections,
retrying until the document is fully retrieved.  Re-getting files
from where a transfer left off works on servers (both HTTP and FTP) that
support it. Matching of wildcards and recursive mirroring of
directories are available when retrieving via FTP.  Both HTTP and FTP
retrievals can be time-stamped, thus Wget can see if the remote file
has changed since last retrieval and automatically retrieve the new
version if it has.

Install wget if you need to retrieve large numbers of files with HTTP or
FTP, or if you need a utility for mirroring web sites or FTP directories.

%description -l cs


%prep
%setup -q

%build
%configure --sysconfdir=/etc
make

%install
rm -rf $RPM_BUILD_ROOT
%makeinstall
gzip $RPM_BUILD_ROOT%{_infodir}/*

%post
/sbin/install-info %{_infodir}/wget.info.gz %{_infodir}/dir

%preun
if [ "$1" = 0 ]; then
    /sbin/install-info --delete %{_infodir}/wget.info.gz %{_infodir}/dir
fi

%clean
rm -rf $RPM_BUILD_ROOT

%files
%defattr(-,root,root)
%doc AUTHORS MAILING-LIST NEWS README INSTALL doc/ChangeLog doc/sample.wgetrc
%config /etc/wgetrc
%{_bindir}/wget
%{_infodir}/*
/usr/share/locale/*/LC_MESSAGES/*

%changelog
* Wed Jan  3 2001 Jan Prikryl <prikryl at cg.tuwien.ac.at>
- preliminary version for 1.7
- removed all RedHat patches from 1.5.3 for this moment

* Tue Aug  1 2000 Bill Nottingham <notting at redhat.com>
- setlocale for LC_CTYPE too, or else all the translations think their
  characters are unprintable.

* Thu Jul 13 2000 Prospector <bugzilla at redhat.com>
- automatic rebuild

* Sun Jun 11 2000 Bill Nottingham <notting at redhat.com>
- build in new environment

* Mon Jun  5 2000 Bernhard Rosenkraenzer <bero at redhat.com>
- FHS compliance

* Thu Feb  3 2000 Bill Nottingham <notting at redhat.com>
- handle compressed man pages

* Thu Aug 26 1999 Jeff Johnson <jbj at redhat.com>
- don't permit chmod 777 on symlinks (#4725).

* Sun Mar 21 1999 Cristian Gafton <gafton at redhat.com> 
- auto rebuild in the new build environment (release 4)

* Fri Dec 18 1998 Bill Nottingham <notting at redhat.com>
- build for 6.0 tree
- add Provides

* Sat Oct 10 1998 Cristian Gafton <gafton at redhat.com>
- strip binaries
- version 1.5.3

* Sat Jun 27 1998 Jeff Johnson <jbj at redhat.com>
- updated to 1.5.2

* Thu Apr 30 1998 Cristian Gafton <gafton at redhat.com>
- modified group to Applications/Networking

* Wed Apr 22 1998 Cristian Gafton <gafton at redhat.com>
- upgraded to 1.5.0
- they removed the man page from the distribution (Duh!) and I added it back
  from 1.4.5. Hey, removing the man page is DUMB!

* Fri Nov 14 1997 Cristian Gafton <gafton at redhat.com>
- first build against glibc



