Security Hole on webservers run on various OS, How to close UNIX hole

Post by Mark Worsdall » Thu, 22 Jan 1998 04:00:00



Hi,

I have written a program that enables me, via the web, to access any file
(access meaning: see it and download it) on a web server's hard drive. It
basically explores the drive (from / downwards). The Apache server
that runs my program delivers the results to my browser. I wrote it
initially to test the security of our account.

So what permissions should I use on our web directory:-

/usr/www/foobar

That will still allow the server to deliver the web pages and yet stop
people on other accounts from accessing it via chdir (telnet/ftp)?

We also have home directories that are not visible to the www, called:-
/usr/home/foobar

Now my programs can access this directory too! So what should I set the
file permissions to, again to stop other account holders with our ISP
from accessing it?

For the record, my program traversed the entire disc that the web server
is running on. It was only denied access to a few directories.

Also, this means that no logs are kept!! My 2nd program, called by the
first program, downloads the actual file to my browser, and the only log
kept of course is that someone ran the download program.

Is this what governments do to obtain people's files without permission?:-

Say they are targeting http://www.foobar.co.uk, they can do this:-

1) Find out who the ISP is for http://www.foobar.co.uk
2) Buy a cgi account with the same ISP
3) Use a program like mine. (Will not give details publicly.)
4) Run the program and download the files, including any within NON-web
directories.
--
Mark Worsdall
Home  :- shadowwebATworsdall.demon.co.uk
Any opinion given is my own personal belief...


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Barry Margolin » Thu, 22 Jan 1998 04:00:00




Quote:>I have written a program that enables me, via the web, to access any file
>(access meaning: see it and download it) on a web server's hard drive. It
>basically explores the drive (from / downwards). The Apache server
>that runs my program delivers the results to my browser. I wrote it
>initially to test the security of our account.

Your program works with any web server?  I find this difficult to believe.
Most web servers try hard to allow access only to certain directories.
While I'm sure some HTTP servers have bugs that allow you to get out of
this part of the filesystem, there are enough different servers with
different bugs that I doubt your one client program can defeat them all.

Can you describe how your program does this?

Quote:>So what permissions should I use on our web directory:-

>/usr/www/foobar

>That will still allow the server to deliver the web pages and yet stop
>people on other accounts from accessing it via chdir (telnet/ftp)?

You should make it readable only by the account that the webserver runs
as.  This seems to be the opposite of what you're describing above (this
directory is *supposed* to be reachable via the web server, so it's not a
problem if your program can access it).
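For concreteness, here is a minimal sketch of one common way to get that
effect while the account owner can still edit the pages: keep the owner,
but give the directory to the server's group and cut everyone else out.
The group name "www" is an assumption; your ISP's setup will differ.

    /* Sketch: give /usr/www/foobar to its owner plus the web server's
     * group (assumed here to be "www"), mode 0750, so the server can
     * still read the pages but other accounts cannot chdir in. */
    #include <stdio.h>
    #include <grp.h>
    #include <sys/stat.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void)
    {
        struct group *gr = getgrnam("www");   /* assumed server group */

        if (gr == NULL) {
            fprintf(stderr, "no such group: www\n");
            return 1;
        }
        if (chown("/usr/www/foobar", (uid_t)-1, gr->gr_gid) != 0)
            perror("chown");
        /* rwxr-x--- : owner full, server group read/search, others none */
        if (chmod("/usr/www/foobar", S_IRWXU | S_IRGRP | S_IXGRP) != 0)
            perror("chmod");
        return 0;
    }

In practice you would do the same thing with chgrp/chmod from the shell;
the point is only which permission bits end up set.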

Quote:>We also have home directories that are not visible to the www, called:-
>/usr/home/foobar

>Now my programs can access this directory too! So what should I set the
>file permissions to, again to stop other account holders with our ISP
>from accessing it?

Usually there's something in the web server configuration file that
specifies what directories it can access.  Maybe your web server is
misconfigured.

Quote:>Also, this means that no logs are kept!! My 2nd program, called by the
>first program, downloads the actual file to my browser, and the only log
>kept of course is that someone ran the download program.

Most web servers log every request they handle.  I don't know what you mean
by the log saying that someone ran the download program -- web servers
don't know what program the client is running.
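For example (this is an illustrative, made-up entry in Apache's common
log format), a request for such a CGI leaves a line like this in the
server's access_log, query string and all:

    remotehost - - [21/Jan/1998:12:34:56 +0000] "GET /cgi-bin/program?treepath=%2Fetc&files=on HTTP/1.0" 200 4123

So the path being fetched is recorded in the server's own log, whatever
the CGI itself does or doesn't record.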

Quote:>Is this what governments do to obtain people's files without permission?:-

>Say they are targeting http://www.foobar.co.uk, they can do this:-

>1) Find out who the ISP is for http://www.foobar.co.uk
>2) Buy a cgi account with the same ISP
>3) Use a program like mine. (Will not give details publicly.)
>4) Run the program and download the files, including any within NON-web
>directories.

Oh, your program runs on the web server, it's not a web client.  So it
can't access any web server, only the one it's installed on.

Many ISPs will not allow CGI use on shared web servers, only on web
servers dedicated to a single client.  And some web servers can be
configured to change to a different userid depending on the virtual host
that was accessed; customers can protect their files by making them
readable only by their userid -- your CGI will only be able to read
world-readable files.  Some web servers can even use chroot() to restrict
the parts of the filesystem that the server can access.
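The userid-per-vhost idea, reduced to a sketch (this shows the mechanism,
not any particular server's code; the uid/gid and the CGI path are
made-up values that a real server would take from its vhost config):

    /* Sketch: a server started as root hands each CGI request to a
     * child that first switches to the owning vhost's uid/gid, so the
     * CGI can read only what that user could read anyway. */
    #include <sys/types.h>
    #include <sys/wait.h>
    #include <unistd.h>

    static int run_cgi_as(uid_t vhost_uid, gid_t vhost_gid,
                          const char *cgi_path)
    {
        pid_t pid = fork();

        if (pid == 0) {
            /* order matters: drop group first, then user */
            if (setgid(vhost_gid) != 0 || setuid(vhost_uid) != 0)
                _exit(127);             /* refuse to run half-privileged */
            execl(cgi_path, cgi_path, (char *)NULL);
            _exit(127);                 /* exec failed */
        }
        if (pid < 0)
            return -1;
        return waitpid(pid, NULL, 0) == pid ? 0 : -1;
    }

    int main(void)
    {
        /* hypothetical values; needs root for the setuid to succeed */
        return run_cgi_as(1001, 1001,
                          "/usr/www/h/o/o/hoobar1/cgi-bin/program");
    }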

--

GTE Internetworking, Powered by BBN, Cambridge, MA
Support the anti-spam movement; see <http://www.cauce.org/>
Please don't send technical questions directly to me, post them to newsgroups.


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Mark Worsdall » Thu, 22 Jan 1998 04:00:00




Quote:

>Oh, your program runs on the web server, it's not a web client.  So it
>can't access any web server, only the one it's installed on.

Yes, that is correct. It is what I meant all the way through, but my
descriptive powers are flawed :)

Quote:

>Many ISPs will not allow CGI use on shared web servers, only on web
>servers dedicated to a single client.  

So an ISP such as demon.co.uk, who do not allow cgi with their standard
accounts, would place a cgi account (if they offered such an account) on
a different drive, or run a web server on a totally separate box? So
they would have something like this:-
===========================
Standard web accounts, with no client-written cgi =
1 web server, serving web pages/users:-

http://www.foobar1.demon.co.uk
http://www.goobar1.demon.co.uk
http://www.hoobar1.demon.co.uk
http://www.ioobar1.demon.co.uk

Pages physically stored maybe (not important to this thread)
/usr/www/f/a/ etc etc etc
/usr/www/f/o/o/foobar1

etc etc etc

/usr/www/i/a/ etc etc etc
/usr/www/i/o/o/ioobar1
===========================
Whereas various accounts with client-written cgi =
separate web servers, serving each account's web pages:-

http://www.foobar1.demon.co.uk
Pages physically stored maybe (not important to this thread) on box
number 1 which is running a web server. Path could be:-
/usr/www

etc etc etc

http://www.ioobar1.demon.co.uk
Pages physically stored maybe (not important to this thread) on box
number 4 which is running a web server. Path could be:-
/usr/www
===========================

I think NOT, as I can hardly see an ISP having a separate box per cgi
account, unless they have a separate drive or box per cgi account, with
all accounts served from one server which has access to each separate
drive or box containing each cgi account.

If, however, the cgi account holder called hoobar1 has his web pages on
the same drive as the other account holders:-

http://www.foobar1.demon.co.uk (standard, no cgi)
/usr/www/f/a/
/usr/www/f/o/o/foobar1
/usr/www/f/o/o/foobar1/index.html
/usr/www/f/o/o/foobar1/non-linked_file
/usr/logs/f/o/o/foobar1/access_log
/usr/ftp/f/o/o/foobar1/

http://www.goobar1.demon.co.uk (standard, no cgi)
/usr/www/g/a/
/usr/www/g/o/o/goobar1

http://www.hoobar1.demon.co.uk (full cgi)
/usr/www/h/a/
/usr/www/h/o/o/hoobar1
/usr/www/h/o/o/hoobar1/index.html
/usr/www/h/o/o/hoobar1/cgi-bin (containing my programs)

http://www.ioobar1.demon.co.uk (standard, no cgi)
/usr/www/i/a/ etc etc etc
/usr/www/i/o/o/ioobar1

In this case my 2 programs work as I suggest! I have an initial html
page that uses GET to pass the requested stuff:-

http://www.hoobar1.co.uk/cgi-bin/cgiwrap/hoobar1/program?up=treepath
&treepath=%2F&files=on&recurse=off

?up=treepath

&treepath=%2Fusr (/usr)
&files=on (Display files=on)
&recurse=on (Recurse directories=on)

So the program extracts the requested path and options from the GET
method. It then does a readdir and displays what it reads back to the
browser. Then you can click on a directory or file: if on a directory,
the same program is run again; if a file is clicked on, another program
is called which either displays the file or downloads it.
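Mark isn't publishing his programs, and nothing below is his code, but
the generic mechanism being described is no more exotic than
opendir()/readdir() steered by a query-string parameter. A stripped-down
sketch (it skips proper URL-decoding of things like %2F):

    /* Sketch of the generic mechanism only: a CGI that lists whatever
     * directory the query string names. Nothing in Unix stops it being
     * pointed at any world-readable directory on the machine. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <dirent.h>

    int main(void)
    {
        char *qs = getenv("QUERY_STRING");
        const char *path = "/";
        DIR *d;
        struct dirent *e;

        /* real code would URL-decode and parse all the parameters */
        if (qs != NULL && strncmp(qs, "treepath=", 9) == 0)
            path = qs + 9;

        printf("Content-type: text/plain\r\n\r\n");

        if ((d = opendir(path)) == NULL) {
            printf("cannot open %s\n", path);
            return 0;
        }
        while ((e = readdir(d)) != NULL)
            printf("%s\n", e->d_name);
        closedir(d);
        return 0;
    }

The "download" half of the scheme is the same idea with open()/read()
in place of opendir()/readdir().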

Quote:>And some web servers can be
>configured to change to a different userid depending on the virtual host
>that was accessed; customers can protect their files by making them
>readable only by their userid -- your CGI will only be able to read
>world-readable files.  Some web servers can even use chroot() to restrict
>the parts of the filesystem that the server can access.

Not too sure what you mean here. How do I test this legitimately
without causing problems? I.e., I had best not give the 2 programs
out publicly?

Mark.
--
Mark Worsdall (Webmaster) - WEB site:- http://www.shadow.org.uk
Shadow:- webmasterATshadow.org.uk    
Home  :- shadowwebATworsdall.demon.co.uk
Any opinion given is my own personal belief...


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Barry Margolin » Fri, 23 Jan 1998 04:00:00




Quote:>I think NOT, as I can hardly see an ISP having a separate box per cgi
>account, unless they have a separate drive or box per cgi account, with
>all accounts served from one server which has access to each separate
>drive or box containing each cgi account.

This is why most consumer-oriented ISPs don't allow customers to install
CGIs.

Quote:>>And some web servers can be
>>configured to change to a different userid depending on the virtual host
>>that was accessed; customers can protect their files by making them
>>readable only by their userid -- your CGI will only be able to read
>>world-readable files.  Some web servers can even use chroot() to restrict
>>the parts of the filesystem that the server can access.

>Not too sure what you mean here. How do I test this legitimately
>without causing problems? I.e., I had best not give the 2 programs
>out publicly?

If you're trying to find a web service provider that will protect your
files so that they can't be seen by the other customers' CGI scripts, you
can ask them when you inquire about getting an account.  They should be
able to tell you whether they run each virtual host as a separate user, so
that you can restrict access to your files to your own CGIs.

--

GTE Internetworking, Powered by BBN, Cambridge, MA
Support the anti-spam movement; see <http://www.cauce.org/>
Please don't send technical questions directly to me, post them to newsgroups.


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Mark Worsdall » Fri, 23 Jan 1998 04:00:00




Quote:

>If you're trying to find a web service provider that will protect your
>files so that they can't be seen by the other customers' CGI scripts, you
>can ask them when you inquire about getting an account.  They should be
>able to tell you whether they run each virtual host as a separate user, so
>that you can restrict access to your files to your own CGIs.

So on a virtual host run as a separate user, where I have my own cgi
scripts, you are saying that once the script is run it would die with
errors if it was told to access from the root, say /usr or /?

So how do they stop a script which is running on the server (remembering
that the server just runs the script directly), unless all virtual hosts
run such scripts through localhost / loopback (127.0.0.1) somehow?

Mark

--
Mark Worsdall (Webmaster) - WEB site:- http://www.shadow.org.uk
Shadow:- webmasterATshadow.org.uk    
Home  :- shadowwebATworsdall.demon.co.uk
Any opinion given is my own personal belief...


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Marc Slemko » Fri, 23 Jan 1998 04:00:00





>>If you're trying to find a web service provider that will protect your
>>files so that they can't be seen by the other customers' CGI scripts, you
>>can ask them when you inquire about getting an account.  They should be
>>able to tell you whether they run each virtual host as a separate user, so
>>that you can restrict access to your files to your own CGIs.
>So on a virtual host run as a separate user, where I have my own cgi
>scripts, you are saying that once the script is run it would die with
>errors if it was told to access from the root, say /usr or /?

Erm... no, since those are normally publicly readable directories anyway.
However, if the server _runs_ as a distinct user for each vhost, then
each user can make their files unreadable by anyone other than the user
their vhost runs as.

Or, if you simply run CGIs as another user (i.e. the server runs as the
same uid for all vhosts), you can make your files group-readable by a
group that only the web server is in.
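A sketch of that second arrangement, with "www" again standing in for
whatever group only the web server belongs to, and a made-up filename
under the thread's example home directory:

    /* Sketch: file readable by its owner and by the server's group
     * ("www" is an assumed name), mode 0640; other users' CGIs,
     * which are not in that group, get permission denied. */
    #include <stdio.h>
    #include <grp.h>
    #include <sys/stat.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void)
    {
        struct group *gr = getgrnam("www");   /* assumed server group */

        if (gr == NULL) {
            fprintf(stderr, "no such group: www\n");
            return 1;
        }
        if (chown("/usr/home/foobar/members.list",
                  (uid_t)-1, gr->gr_gid) != 0 ||
            chmod("/usr/home/foobar/members.list",
                  S_IRUSR | S_IWUSR | S_IRGRP) != 0)
            perror("protect");
        return 0;
    }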

If they want to stop what you describe, they simply run the server or
the CGIs in a chroot()ed environment, so they can only access a
specified subset of the filesystem.  Paranoid people like me run web
servers chrooted anyway, even though I don't allow users to run CGI, I
reasonably trust Apache (having gone through the full source tree and
done a security audit several times), and users have no access other
than (chrooted) ftp access to the server.
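The chroot() arrangement described here, as a sketch (the path and the
unprivileged uid are assumptions; a real server also has to open its
logs and sockets before this point):

    /* Sketch: at startup (as root) the server locks itself into its
     * document tree and drops privileges; afterwards, nothing outside
     * /usr/www exists as far as it or any CGI it spawns is concerned. */
    #include <stdio.h>
    #include <sys/types.h>
    #include <unistd.h>

    int main(void)
    {
        if (chroot("/usr/www") != 0 || chdir("/") != 0) {
            perror("chroot");
            return 1;
        }
        if (setgid(65534) != 0 || setuid(65534) != 0) { /* e.g. "nobody" */
            perror("drop privileges");
            return 1;
        }
        /* ... accept and serve requests from here on ... */
        return 0;
    }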

The fact that CGI scripts can read files is nothing new and has a lot
less to do with web servers than with the way Unix works.


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Mark Worsdall » Fri, 23 Jan 1998 04:00:00




Quote:>I reasonably trust Apache (having gone through the full source tree
>and done a security audit several times), and users have no access
>other than (chrooted) ftp access to the server.

>The fact that CGI scripts can read files is nothing new and has a lot
>less to do with web servers than with the way Unix works.

Ah, well, ftp is where I first thought up the script. The server I
run on is also an Apache server.

--
Mark Worsdall (Webmaster) - WEB site:- http://www.shadow.org.uk
Shadow:- webmasterATshadow.org.uk    
Home  :- shadowwebATworsdall.demon.co.uk
Any opinion given is my own personal belief...


Security Hole on webservers run on various OS, How to close UNIX hole

Post by Mark Worsdall » Sun, 25 Jan 1998 04:00:00


Hi, more info gathered:-

Quote:>>The fact that CGI scripts can read files is nothing new and has a lot
>>less to do with web servers than with the way Unix works.

If called from the cgi-bin, you can explore your ISP's hard drive,
though there are a lot of directories whose contents you cannot view.
Still, there is a large amount of data available that you would normally
not easily find, but with my program it is all easily got at and
downloaded in a civilised manner.

BUT...
call it from your own cgiwrap, as yourself as a user, and bingo: treble,
if not more, of the directories become available.

I have found out this is because the Apache server needs access to
directories where we store our members.access files etc., in places
that are not normally available to the world through URL browsing;
they are, however, available via my program.
--
Mark Worsdall
Any opinion given is my own personal belief...


Closing suid root security holes forever

It would seem that a large class of UNIX security holes is based
on tricking some suid program into writing to files which it's
not supposed to.

Swapping euid and uid works, but is error-prone (out of 100
opens, you'll probably only secure 99).
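(The swap in question looks roughly like this sketch; the error-prone
part is remembering to wrap every single open() this way:)

    /* Sketch: a suid program temporarily becomes the invoking user
     * around an open() of a user-supplied filename. Forget this dance
     * around even one open() and the hole is back. */
    #include <fcntl.h>
    #include <sys/types.h>
    #include <unistd.h>

    int open_as_invoker(const char *path)
    {
        uid_t euid = geteuid();     /* the privileged identity */
        int fd;

        if (seteuid(getuid()) != 0) /* drop to the real (invoking) user */
            return -1;
        fd = open(path, O_RDONLY);
        seteuid(euid);              /* regain privileges afterwards */
        return fd;
    }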

I would suggest adding a special file type to make this impossible.
Suppose we have a special bit associated with each file (let's
call it the "system bit"), turned on for all security-related
files (/etc/passwd, /etc/hosts.equiv, /etc/inetd.conf, ...)

The system ensures that they have to be opened with a special
flag for writing, say with

    open("/etc/passwd", O_RDWR | O_SYSTEM);

Unless the O_SYSTEM flag is present, all processes (including, and
especially, those with root privileges) get 'permission denied'.

This would mean that a cracker who has subverted some suid root
program into appending a new root id to /etc/passwd won't
succeed, because said suid root program won't open any random
file with that flag set.
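To make the proposed semantics concrete (O_SYSTEM is the hypothetical
flag from this post, existing on no real system; the bit value below is
invented):

    /* Sketch of the proposal's intended behaviour, not working code:
     * O_SYSTEM is hypothetical and its value is made up. */
    #include <fcntl.h>

    #define O_SYSTEM 0x40000000   /* hypothetical "I mean it" flag */

    void demo(void)
    {
        int fd;

        /* A subverted suid-root program tries the usual thing... */
        fd = open("/etc/passwd", O_WRONLY | O_APPEND);
        /* ...and under the proposal gets fd == -1 with 'permission
         * denied', root or not, because the file's system bit is set. */

        /* Only a tool that states its intent explicitly gets through: */
        fd = open("/etc/passwd", O_RDWR | O_SYSTEM);
        (void)fd;
    }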

Programs like 'useradd', 'userdel', etc. will open /etc/passwd with
this flag, but those won't be run as userid root; the most common
editors could easily be patched to detect the presence of this flag.

Bang goes one class of security risks inherent in UNIX systems.

This would seem to be such a trivial hack to, let's say, Linux, that
I'd be surprised if nobody had thought of this before; yet, at least
on the UNIX systems I know, nobody does this.  (I once heard something
about BSD 4.4 implementing something called 'immutable files', which
may be related, but I don't know BSD 4.4.)
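(For reference, the 4.4BSD mechanism is the immutable flag, set with
chflags(2); it is blunter than the proposal above, since it blocks all
writes rather than gating them on an open() flag. It is BSD-only, so
this sketch won't compile elsewhere:)

    /* 4.4BSD-derived systems: mark a file immutable. Once the kernel
     * securelevel is raised, even root can neither write the file nor
     * clear the flag until the system comes back up insecure. */
    #include <stdio.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        if (chflags("/etc/passwd", SF_IMMUTABLE) != 0) {
            perror("chflags");
            return 1;
        }
        return 0;
    }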

Comments?
--

The joy of engineering is to find a straight line on a double
logarithmic diagram.
