Which FTP app for Ubuntu


Which FTP app for Ubuntu

Postby jer1ch0 » Fri Jul 08, 2005 10:56 pm

I'm using gFTP 2.0.18 at the moment.
It's good but not as good as Cute FTP in Windows. Is it the best and fastest programme you can get?
Don't get me wrong, it's good; I'm just wondering what other people here use or prefer.
Thanks
jer1ch0
LXF regular
 
Posts: 135
Joined: Sat Apr 09, 2005 10:42 am
Location: Ireland

RE: Which FTP app for Ubuntu

Postby skecs » Sat Jul 09, 2005 6:44 am

I use gFTP, but I also use wget from the command line to grab very large files. I've just found it to be more efficient, and it's much more feature-rich than people think.

All features are listed below:
steve@linux:~> wget -h
GNU Wget 1.9.1, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
-V, --version display the version of Wget and exit.
-h, --help print this help.
-b, --background go to background after startup.
-e, --execute=COMMAND execute a `.wgetrc'-style command.

Logging and input file:
-o, --output-file=FILE log messages to FILE.
-a, --append-output=FILE append messages to FILE.
-d, --debug print debug output.
-q, --quiet quiet (no output).
-v, --verbose be verbose (this is the default).
-nv, --non-verbose turn off verboseness, without being quiet.
-i, --input-file=FILE download URLs found in FILE.
-F, --force-html treat input file as HTML.
-B, --base=URL prepends URL to relative links in -F -i file.

Download:
-t, --tries=NUMBER set number of retries to NUMBER (0 unlimits).
--retry-connrefused retry even if connection is refused.
-O --output-document=FILE write documents to FILE.
-nc, --no-clobber don't clobber existing files or use .# suffixes.
-c, --continue resume getting a partially-downloaded file.
--progress=TYPE select progress gauge type.
-N, --timestamping don't re-retrieve files unless newer than local.
-S, --server-response print server response.
--spider don't download anything.
-T, --timeout=SECONDS set all timeout values to SECONDS.
--dns-timeout=SECS set the DNS lookup timeout to SECS.
--connect-timeout=SECS set the connect timeout to SECS.
--read-timeout=SECS set the read timeout to SECS.
-w, --wait=SECONDS wait SECONDS between retrievals.
--waitretry=SECONDS wait 1...SECONDS between retries of a retrieval.
--random-wait wait from 0...2*WAIT secs between retrievals.
-Y, --proxy=on/off turn proxy on or off.
-Q, --quota=NUMBER set retrieval quota to NUMBER.
--bind-address=ADDRESS bind to ADDRESS (hostname or IP) on local host.
--limit-rate=RATE limit download rate to RATE.
--dns-cache=off disable caching DNS lookups.
--restrict-file-names=OS restrict chars in file names to ones OS allows.

Directories:
-nd, --no-directories don't create directories.
-x, --force-directories force creation of directories.
-nH, --no-host-directories don't create host directories.
-P, --directory-prefix=PREFIX save files to PREFIX/...
--cut-dirs=NUMBER ignore NUMBER remote directory components.

HTTP options:
--http-user=USER set http user to USER.
--http-passwd=PASS set http password to PASS.
-C, --cache=on/off (dis)allow server-cached data (normally allowed).
-E, --html-extension save all text/html documents with .html extension.
--ignore-length ignore `Content-Length' header field.
--header=STRING insert STRING among the headers.
--proxy-user=USER set USER as proxy username.
--proxy-passwd=PASS set PASS as proxy password.
--referer=URL include `Referer: URL' header in HTTP request.
-s, --save-headers save the HTTP headers to file.
-U, --user-agent=AGENT identify as AGENT instead of Wget/VERSION.
--no-http-keep-alive disable HTTP keep-alive (persistent connections).
--cookies=off don't use cookies.
--load-cookies=FILE load cookies from FILE before session.
--save-cookies=FILE save cookies to FILE after session.
--post-data=STRING use the POST method; send STRING as the data.
--post-file=FILE use the POST method; send contents of FILE.

HTTPS (SSL) options:
--sslcertfile=FILE optional client certificate.
--sslcertkey=KEYFILE optional keyfile for this certificate.
--egd-file=FILE file name of the EGD socket.
--sslcadir=DIR dir where hash list of CA's are stored.
--sslcafile=FILE file with bundle of CA's
--sslcerttype=0/1 Client-Cert type 0=PEM (default) / 1=ASN1 (DER)
--sslcheckcert=0/1 Check the server cert agenst given CA
--sslprotocol=0-3 choose SSL protocol; 0=automatic,
1=SSLv2 2=SSLv3 3=TLSv1

FTP options:
-nr, --dont-remove-listing don't remove `.listing' files.
-g, --glob=on/off turn file name globbing on or off.
--passive-ftp use the "passive" transfer mode.
--retr-symlinks when recursing, get linked-to files (not dirs).

Recursive retrieval:
-r, --recursive recursive download.
-l, --level=NUMBER maximum recursion depth (inf or 0 for infinite).
--delete-after delete files locally after downloading them.
-k, --convert-links convert non-relative links to relative.
-K, --backup-converted before converting file X, back up as X.orig.
-m, --mirror shortcut option equivalent to -r -N -l inf -nr.
-p, --page-requisites get all images, etc. needed to display HTML page.
--strict-comments turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
-A, --accept=LIST comma-separated list of accepted extensions.
-R, --reject=LIST comma-separated list of rejected extensions.
-D, --domains=LIST comma-separated list of accepted domains.
--exclude-domains=LIST comma-separated list of rejected domains.
--follow-ftp follow FTP links from HTML documents.
--follow-tags=LIST comma-separated list of followed HTML tags.
-G, --ignore-tags=LIST comma-separated list of ignored HTML tags.
-H, --span-hosts go to foreign hosts when recursive.
-L, --relative follow relative links only.
-I, --include-directories=LIST list of allowed directories.
-X, --exclude-directories=LIST list of excluded directories.
-np, --no-parent don't ascend to the parent directory.
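To show how those options compose in practice, here's a dry-run sketch of the two invocations I lean on most (the URL, filename, and rate are made up, so substitute your own). It echoes the commands instead of running them, so it's safe to try anywhere:

```shell
#!/bin/sh
# Resume a large, partially-downloaded file (-c) and cap the bandwidth
# (--limit-rate) so the rest of the connection stays usable.
# Placeholder URL -- swap in the real one.
CMD="wget -c --limit-rate=200k http://example.com/big.iso"

# Mirror a site for offline reading: -m expands to -r -N -l inf -nr,
# -k rewrites links to work locally, -p grabs the images/CSS a page needs.
MIRROR="wget -m -k -p http://example.com/"

# Echo rather than execute: drop the quotes and eval/run them for real use.
echo "$CMD"
echo "$MIRROR"
```

Strip the `echo` wrapper once you've filled in a real URL.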

Hope this helps.

8) - I've put the flame-proof undies on!!
Regards from
Downunder!
.... _
... (0)>
... / /\
.. / / .)
.. V_/_
Linux Powered!
skecs
 
Posts: 76
Joined: Fri Apr 22, 2005 9:22 am
Location: Bathurst, NSW Australia

RE: Which FTP app for Ubuntu

Postby fingers99 » Sat Jul 09, 2005 8:10 pm

Wget is great. But perhaps the greatest (conventional) FTP client (and maybe the prettiest application to use the GTK toolkit) is IglooFTP.
fingers99
LXF regular
 
Posts: 143
Joined: Thu Apr 07, 2005 6:15 pm

RE: Which FTP app for Ubuntu

Postby linuxgirlie » Sat Jul 09, 2005 8:15 pm

Ohhh, I use gFTP. Now, I am a KDE person, but KBear is awful, so I do all my web designing via gFTP.
My knowledge comes with no warranty...........

Server operating system designed for schools:http://www.linuxschools.com
linuxgirlie
LXF regular
 
Posts: 787
Joined: Sat Apr 09, 2005 6:34 pm
Location: Kent...UK

RE: Which FTP app for Ubuntu

Postby A-Wing » Sat Jul 09, 2005 8:53 pm

I tend to do uploading via scp/ssh and leeching via wget. When I need a GUI FTP client (not often) I use gFTP.
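For anyone curious, that workflow looks roughly like this as a dry-run sketch (host, paths, and the URL list are all hypothetical; it echoes the commands rather than running them):

```shell
#!/bin/sh
# Upload over SSH: -r recurses into the directory, and everything is
# encrypted in transit -- no FTP passwords on the wire.
# user@example.com and the paths are placeholders.
UPLOAD="scp -r public_html/ user@example.com:/var/www/"

# Leeching: put one URL per line in urls.txt, feed it to wget with -i,
# and -b sends it to the background so you get your shell back.
LEECH="wget -b -i urls.txt"

# Echoed, not executed -- remove the echo/quotes to run for real.
echo "$UPLOAD"
echo "$LEECH"
```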
Andrew Hutchings, Linux Jedi
http://www.a-wing.co.uk
A-Wing
LXF regular
 
Posts: 460
Joined: Tue Jul 05, 2005 7:25 pm
Location: Wellingborough

Re: RE: Which FTP app for Ubuntu

Postby nordle » Sat Jul 09, 2005 9:57 pm

linuxgirlie wrote: Ohhh, I use gFTP. Now, I am a KDE person, but KBear is awful, so I do all my web designing via gFTP.


If you like KDE, have you tried KFTPGrabber? I can't really vouch for it; I use it, but only for uploading files, that's it.

http://kftpgrabber.sourceforge.net/
nordle
LXF regular
 
Posts: 1500
Joined: Fri Apr 08, 2005 9:56 pm

RE: Re: RE: Which FTP app for Ubuntu

Postby linuxgirlie » Sat Jul 09, 2005 10:00 pm

I haven't seen that, I'll have to DL it and give it a go...
My knowledge comes with no warranty...........

Server operating system designed for schools:http://www.linuxschools.com
linuxgirlie
LXF regular
 
Posts: 787
Joined: Sat Apr 09, 2005 6:34 pm
Location: Kent...UK

