The Basics of FTP

- Grogan -


What is FTP?

FTP, which is an acronym for File Transfer Protocol, is a client-server protocol that allows users to transfer files to and from other computers over a TCP/IP network. FTP has its origins at M.I.T. around 1971, and while it has evolved somewhat over the years, it is one of the oldest protocols still in use, dating back to the days of Unix mainframes and dumb terminals. For example, a student at a university would log on to their shell account on a campus server from a terminal in the computer lab and invoke an FTP client program to connect to another computer and download course curriculum or research documents to their home directory. If you were dialing in to the remote shell from home, you could then use your terminal program and invoke ZMODEM to transfer the file from your home directory to your local computer.
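
That client-server conversation looks much the same today. As a modern illustration (not part of the original tutorial), here is a minimal sketch using Python's standard ftplib module; the host name, directory, and file name are hypothetical:

    from ftplib import FTP

    # Connect to a (hypothetical) server on the default FTP port, 21.
    ftp = FTP("ftp.example.com")
    ftp.login()                          # no arguments = anonymous login
    ftp.cwd("/pub/docs")                 # change to the remote directory
    # Retrieve the file in binary mode and write it locally.
    with open("paper.txt", "wb") as f:
        ftp.retrbinary("RETR paper.txt", f.write)
    ftp.quit()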

Today, it's still used extensively on that big wide area TCP/IP network that we call The Internet. If you're not familiar with FTP, chances are you've used it many times without even giving it a second thought. Many of the download links you click are actually URLs that point to a file on a computer that is acting as an FTP server; your Web browser automatically downloads them.
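
You can spot these links by their address: the URL uses the ftp scheme rather than http. A made-up example:

    ftp://ftp.example.com/pub/utils/setup.zip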

Why do they still use FTP?

Quite simply, it is still widely used because it is one of the most stable and reliable protocols for transferring files over the Internet. This means your download has a greater chance of completing without transfer errors and arriving intact, without corruption. The FTP specification has strict rules for when the server may break the FTP session with the client, and for marking the end of the file data (EOF). Standards are what make platform-independent technologies like the Internet possible: authors of FTP server and client software must adhere to the FTP protocol standards (or they won't be in business long), and that adherence is what makes the communication reliable.

FTP also uses less overhead than some other transfer mechanisms, with fewer bytes sent back and forth to move the same file. The main reason is that FTP transfers files in raw binary form. When you send or receive a file as an e-mail attachment, by contrast, the data is first encoded as text under MIME (no, it's not a clown that doesn't talk, it stands for Multipurpose Internet Mail Extensions), typically using Base64, and converted back to binary on your end. That text encoding inflates the transmission by roughly a third. A plain HTTP (Hyper Text Transfer Protocol) download from a Web server uses MIME types to label the content, but FTP was designed around efficient binary transfer from the start.
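
To see what that text encoding costs, here is a small Python sketch (Base64 is the encoding MIME typically uses for binary attachments):

    import base64

    payload = bytes(range(256)) * 4096      # 1 MiB of arbitrary binary data
    encoded = base64.b64encode(payload)     # the same data as MIME-style text

    print(len(payload))   # 1048576 bytes if sent in raw binary form
    print(len(encoded))   # 1398104 bytes once Base64-encoded, about 33% more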

TCP itself provides excellent error control: each packet carries a checksum that is verified at the receiving end. If a packet in the sequence goes missing or fails the checksum, the receiving computer requests retransmission. This is what makes TCP/IP reliable; if not for this mechanism, we'd seldom successfully download anything from the chaotic public Internet. You are always using TCP/IP over the Internet, so you always have this error correction no matter what mechanism you use to download files, but I thought it prudent to mention it here, seeing as FTP relies on it.

The modern FTP specification also has a mechanism for resuming broken downloads. On the server side, markers are inserted in the data streams that correspond to specific locations in the files. Only the server needs to know the specific mechanism used to mark position. On the client side, when your FTP client program requests a resume (usually automatically when you restart a broken download and save to the same directory), it sends the last marker code to the server, which determines the position. Your download then resumes where it left off. It sure beats starting over.
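
Graphical clients hide these details, but in the common stream-mode case the resume point boils down to a byte offset sent to the server with the REST command. A sketch of that approach in Python's ftplib, with a hypothetical host and file name:

    import os
    from ftplib import FTP

    ftp = FTP("ftp.example.com")
    ftp.login()
    offset = os.path.getsize("big.iso")      # how much we already have on disk
    with open("big.iso", "ab") as f:         # append, don't overwrite
        # The rest= argument makes ftplib send REST <offset> before RETR,
        # so the server restarts the transfer at that position in the file.
        ftp.retrbinary("RETR big.iso", f.write, rest=offset)
    ftp.quit()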

Why do I need FTP Client Software?

Proper FTP client software gives you more control over what you are doing. Web browsers don't make very good FTP clients, and their mechanisms for resuming broken downloads aren't very reliable (assuming they exist, that is). There is also a greater risk of encountering transfer errors. I don't know how many times I've clicked on an FTP URL to download a large file with the Netscape browser, only to have my download stall at the last few bytes or abort with a transfer error. Cancel the transfer and start it again, and it starts over from the beginning. Web browsers are really only designed to work as anonymous FTP clients. If you intend to administer a Web site, you will want to use FTP to transfer your files to your directory on the Web server, and you will not be able to do that with a Web browser; a sketch of that kind of authenticated transfer follows.
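
For illustration only, here is what such an authenticated upload looks like in Python's ftplib; the account name, password, and directory are made up, and your hosting provider's details will differ:

    from ftplib import FTP

    ftp = FTP("ftp.example.com")
    ftp.login("webmaster", "secret")     # a real account this time, not anonymous
    ftp.cwd("/public_html")              # the Web server's document directory
    # STOR uploads the local file to the server in binary mode.
    with open("index.html", "rb") as f:
        ftp.storbinary("STOR index.html", f)
    ftp.quit()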

Where can I get FTP Client Software?

A very good, no-strings-attached FTP client that is free for non-commercial use and supports resuming broken downloads is WS_FTP LE from Ipswitch Inc. (the Pro version is not free). Download the free LE version of WS_FTP for your OS (e.g. Win32) from:

http://download.cnet.com/downloads/0-10064-101-1572132.html

The tutorial that follows will explain the basic steps to use WS_FTP LE.
