The Internet is a general-purpose, public, global computer network which allows computers connected to it to communicate and exchange digital data with one another. Once a computer is coupled to the Internet, a wide variety of options become available. Some of the myriad functions possible over the Internet include sending and receiving electronic mail (e-mail) messages, logging into and participating in live discussions, playing games in real time, viewing pictures, watching streaming video, listening to music, shopping on-line, browsing different web sites, downloading and/or uploading files, etc.
The most popular way of participating in the Internet involves a client/server arrangement. Basically, a server computer provides a service and acts as a host to any number of client computers wishing to avail themselves of that service. For instance, a user may wish to send an e-mail message to a friend. The user first logs his or her client computer, such as a personal computer (PC), onto the Internet through a standard telephone modem, cable modem, digital subscriber line (DSL), etc. The user then composes the e-mail message on the client computer, which contacts and transmits the message over the Internet to a designated e-mail server computer. Subsequently, when the recipient checks for any new e-mail messages, the recipient's client computer will contact the e-mail server. The e-mail server will then send the new e-mail message to the recipient's client computer, again over the Internet. In many cases, a server simply contains content information (e.g., web pages displaying text and/or pictures, real-time stock quotes, etc.). A huge number of clients can access this content information via the Internet.
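The first step of the exchange described above, the client composing a message before handing it to its designated mail server, can be sketched as follows. This is an illustrative sketch only; the addresses, subject line, and the helper name `compose_message` are hypothetical, and a real client would subsequently open a network connection to the mail server to transmit the message.

```python
# Sketch of the client-side step: composing an e-mail message
# before it is handed off to a designated e-mail server.
# All addresses and the subject line are hypothetical examples.
from email.message import EmailMessage

def compose_message(sender, recipient, subject, body):
    """Build a standard e-mail message object on the client side."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

msg = compose_message("user@example.com", "friend@example.com",
                      "Hello", "Just checking in.")
# In practice, the client computer would next contact its e-mail
# server over the Internet and transmit this message for delivery.
print(msg["To"])
```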
Referring to FIG. 1, a typical Internet client/server arrangement is shown. In this example, four clients 101-104 and two servers 105-106 are shown coupled to Internet 107. In general, clients 101-104 are personal computers (PCs), whereas servers 105-106 are more powerful computers with greater hardware, software, and connection resources. Any of the clients 101-104 can transmit and receive data to/from any of the servers 105-106 via Internet 107. Moreover, a single server can handle multiple client requests at the same time. Expanding upon this client/server arrangement, millions upon millions of client and server computers around the world are coupled to the vast Internet and are exchanging information at any given time.
Presently, there are two major protocols used to establish and facilitate data transmissions between clients and servers. These protocols specify a set of technical rules by which client and server programs can communicate with one another. The first protocol is commonly referred to by its acronym, HTTP (Hypertext Transfer Protocol). HTTP is used to transfer data between servers and clients via a browser program (e.g., Navigator or Explorer) over a part of the Internet known as the World Wide Web or “the Web.” HTTP enables a user to simply place a cursor on a displayed hypertext link and click on it. This automatically takes the user to the appropriate web page, to other desired information, or to another resource located on the same or different server on the Internet.
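When a user clicks on a hypertext link as described above, the browser translates that click into a textual HTTP request sent to the appropriate server. The following is a minimal sketch of such a request; the host name and path are hypothetical placeholders, and the helper function `build_get_request` is an illustration rather than part of any particular browser.

```python
# Minimal sketch of the raw HTTP request a browser issues when the
# user clicks a hypertext link.  Host and path are hypothetical.
def build_get_request(host, path):
    """Compose a raw HTTP/1.1 GET request as a text string."""
    return (f"GET {path} HTTP/1.1\r\n"   # request line: method, resource, version
            f"Host: {host}\r\n"          # which web site on the server is wanted
            "Connection: close\r\n"      # close the connection after the response
            "\r\n")                      # blank line ends the request headers

request = build_get_request("www.example.com", "/index.html")
print(request)
```

The server's reply carries the requested web page back to the browser, which renders it for the user.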
The other widely adopted protocol is known as FTP (File Transfer Protocol). FTP enables users to readily transfer files between computers over the Internet. A file is a collection of data (e.g., e-mail messages, web pages, pictures, documents, computer programs, etc.) which is stored under a given name. FTP allows a client computer to download designated files from a server and also to upload files to a server. For example, a user can design and create a web site on a local client computer, store the web pages in one or more files, and then upload these files via FTP to a web server over the Internet. These files are stored on the server and potentially anyone can now access that web page over the Internet. Thereby, FTP servers enable the distribution of software programs and other files over the Internet.
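The upload scenario above follows a standard sequence of FTP control commands. The sketch below lists that dialogue for a hypothetical account, password, and file name; it is an illustration of the protocol flow, not a working client (a real client, such as Python's `ftplib`, also negotiates a separate data channel over which the file contents actually travel).

```python
# Hedged sketch of the control-channel dialogue an FTP client
# conducts to upload a file to a server.  Account name, password,
# and filename are hypothetical placeholders.
def ftp_upload_dialogue(user, password, filename):
    """Return the sequence of FTP control commands for an upload."""
    return [
        f"USER {user}",      # identify the account on the server
        f"PASS {password}",  # authenticate the user
        "TYPE I",            # select binary (image) transfer mode
        "PASV",              # ask the server to open a data port
        f"STOR {filename}",  # upload the named file over the data channel
        "QUIT",              # close the session
    ]

for cmd in ftp_upload_dialogue("webmaster", "secret", "index.html"):
    print(cmd)
```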
Although HTTP and FTP confer great flexibility, ease of use, and functionality to users, there are several associated drawbacks which must still be addressed. One major headache and expense involves the administration, management, and general maintenance of the servers. Ideally, the files or content stored on the servers should be secured against unauthorized users. Furthermore, while some users are granted permission to access the content, they should be prevented from accidentally or intentionally corrupting or otherwise altering the content stored on the servers. At the same time, legitimate owners of the content should be given permission to update or change their content as needed. It is a rather difficult task to monitor and enforce this delicate balance, especially in light of unauthorized users who attempt to crack or hack their way into secure servers. Moreover, in order to leverage the power of most server systems, a single server is often used to support an environment whereby multiple, independent file systems exist. In effect, many different users can share a single server. This necessarily entails setting up multiple accounts—one account per user. Creating multiple accounts opens up the server system to more potential abuses by unauthorized persons.
On the one hand, server systems administrators want to grant legitimate users the ability to perform certain useful commands for administering their own virtual file systems within the server. Otherwise, the administrators themselves are faced with the overwhelming workload of having to manually and directly perform a myriad of trivial tasks for legitimate users who wish to deploy content and applications onto these servers. But on the other hand, server systems administrators would like to deny direct operating-system level access to remote clients in order to minimize security risks and to also minimize security administration overhead.
Another related problem pertains to the fact that HTTP and FTP were designed to meet different needs. As such, these two protocols are used independently. However, with the explosion of e-commerce over the Internet, it is becoming ever more prevalent for users to utilize both protocols. For instance, rather than selling software through traditional shrink-wrap packages at stores, it is becoming more cost efficient to purchase and sell software over the Internet. A customer can use a search engine to find the various sites which are offering the software product for sale. The customer can readily access these sites via HTTP to shop for the best bargain. The customer can then place an order over the Internet via HTTP. After verifying payment, the software program is then downloaded from the server to the buyer's client computer via an FTP file transfer. For the casual computer user, attaining the proficiency in both HTTP and FTP required to complete an e-commerce transaction may be daunting. Furthermore, traditional businesses may have a difficult time finding the HTTP and FTP expertise necessary for transitioning into a more competitive e-commerce offering.
Thus, there is a need in the prior art for a method which removes some of the administrative burden of managing servers. There also exists a need in the prior art for improving the integrity of server systems. It would also be preferable if such a method could also somehow simplify the HTTP/FTP process inherent in e-commerce transactions. The present invention proposes a unique, novel, and elegant solution which satisfies all the above needs.