Friday, August 15, 2008

Spider hacking

It's time to do some serious stuff, but in a small time interval, so here is the tutorial:

Spider hacking: this kind of hacking is also known as search engine hacking, because search engines are in fact spiders; simple, naive users just call them search engines, but leave it..

Let's start with the Google search engine:


Every search engine has some directives (mostly the ones that provide the advanced search), and using these directives can be of immense help, like building an on-the-fly disaster management program, an exact information extraction system, etc... Hackers use these techniques to identify victims within seconds and launch attacks without even using any exploit. Let's do it: get set, and finally go:

directives:

1. intext:
this directive can be used to search for any text inside the document body. Multi-word text must be placed inside double quotes; a single word needs no quotes.
Remember: directives are written in lower case only, and there is no space before or after the ":".

We can use the "+" and "-" signs for a more specific search, e.g.:

intext:cars -intext:"maruti suzuki"

The above query will search for all pages that mention cars, but NOT those pages which include "maruti suzuki".

The searched text is not case sensitive.
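Side note: if you want to build such queries from a script instead of typing them, here is a minimal sketch in Python (the helper name, the example terms and the google.com/search URL are just my own illustration, not part of any official API):

# A minimal sketch: build a Google query string from directive/value pairs.
# Multi-word values get double quotes, single words are left as-is,
# and excluded terms get a leading "-" stuck directly to the directive.
from urllib.parse import quote_plus

def term(directive, value, exclude=False):
    if " " in value:                       # multi-word text needs double quotes
        value = '"' + value + '"'
    prefix = "-" if exclude else ""
    return prefix + directive + ":" + value

query = " ".join([
    term("intext", "cars"),
    term("intext", "maruti suzuki", exclude=True),
])
print(query)                               # intext:cars -intext:"maruti suzuki"
print("https://www.google.com/search?q=" + quote_plus(query))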

2. inurl:
this directive is deadly: it searches for those pages that have a specific portion in their URL, as follows:

inurl:admin

this will list all those pages having admin in their URL (the address).

inurl:"/admin/order"
will list all pages whose URL contains the /admin/order path, and so on.

3. intitle:
this directive is also deadly and can retrieve all those pages which have the specified text inside their title.

note: we can also use the "*" wildcard, which stands in for one or more missing words in a phrase (single-character wildcards like "." are not reliably supported by Google).

e.g:

intitle:"Microsoft IIS 3.0*"

this will retrieve all those pages which have "Microsoft IIS 3.0..." as their title.
This can lead to finding vulnerable hosts in seconds..... we got the victim.
(Get the exact portion of the title by opening such pages: I find the titles of the default pages of vulnerable software, and of the report pages generated by vulnerability scanners, and then search for them. A real victim scanner, developed in a few lines by you, isn't it...)

Type the following in the Google search box:

intitle:"index.of" inurl:admin

and check out the results.

next is ..
4. site:
this directive will search within a site or domain like this:
site:www.victim.com intitle:"index.of*"

will search for all those pages within that site (actually, in this case it will open the folders that have directory browsing turned on).

note: using inurl and site together is mostly worthless.

5. inanchor:
will list all those pages which have the anchor text as specified...
e.g:

inanchor:"My Secrets"
will search for all those pages which contain a hyperlink with the text "My Secrets".

6. link:
this directive provides all those pages which contain a link to the specified address, i.e. the href address in anchors.

7. filetype:

this directive is the deadliest and is used to search for a specific file type. For example, to search for Excel files, specify the extension without the ".", as:

filetype:xls inurl:india

The above search query will retrieve all Excel files whose URL contains india (a directory named india, or a file named india).

I like the following Yahoo feature, with which we can find out whether a user is online or not... In your internet browser's address bar, type:

opi.yahoo.com/online?u=yahooID&m=g&t=2

where u is the variable that takes the Yahoo user ID,

m = the method of status presentation:
"g" returns a graphic and "t" returns plain text,

t = the type (size) of the text or graphic:
t can be 1, 2 or 3, where 2 gives the largest size, 1 medium and 3 the smallest.
e.g:

opi.yahoo.com/online?u=vinkat007&m=g&t=1
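If you would rather check this from a script than from the browser, here is a minimal sketch in Python (it simply fetches the same opi.yahoo.com URL described above with m=t for a text answer; the user ID is only a placeholder, and of course it only works as long as Yahoo still serves that endpoint):

# A minimal sketch: ask opi.yahoo.com for a user's online status as text (m=t).
# The endpoint and parameters come from the description above; the user ID is a placeholder.
from urllib.request import urlopen
from urllib.parse import urlencode

def yahoo_status(user_id):
    params = urlencode({"u": user_id, "m": "t", "t": "2"})
    url = "http://opi.yahoo.com/online?" + params
    with urlopen(url) as resp:
        return resp.read().decode("utf-8", errors="replace")

print(yahoo_status("vinkat007"))   # prints whatever status text the service returns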
