Inside Visual Basic Magazine, March 2000
Reposted with Permission of ZD Net Journals
There's no arguing that the Internet lets us access
amazing volumes of information on virtually any subject. However, if you're like
us, you may have found it difficult to filter out unnecessary information from
this enormous repository. Gathering specific facts can be time consuming, with
data usually scattered across many sites. Search engines like Yahoo!, HotBot,
and even Ask Jeeves have attempted to fill this void, but have been only
partially successful. A recent study found that search engines have indexed less
than 55 percent of the Web. The same study predicted that this percentage would
in fact continue to shrink as the number of new pages on the Internet grows.
In the future, people will probably turn to personal,
automated search programs to find what they need. These Web-bots provide more
targeted and thorough searches. In this article, we'll look at the Web-bot shown
in Figure A, which lets you research any topic on the Internet. Then, we'll
cover a few of the basics you'll need to create a Web-bot fit to rival Jeeves.
To boldly go where no Web-bot has gone before
We included both the Web-bot's project files and a compiled
EXE in this month's download. For now, launch the EXE. To begin, enter the
subject you want to research in the Subject text box. For our example, we
satisfied our Star Trek craving.
Next, indicate how thorough a search you want the bot to
conduct in the Search Depth text box. High numbers make for in-depth searches,
but take longer to complete. Lower numbers are less thorough but finish much
more quickly. If you have a slow Internet connection and only a few minutes to run
the Web-bot, consider entering a 2 or 3. If you have a fast Internet connection
or have a lot of time (for example, you may be running the program overnight),
enter a higher number like 9 or 10. The Web-bot doesn't care how high you make
this number. As you can see in Figure A, we entered 3 for our search depth.
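Under the hood, a depth setting like this typically bounds a recursive crawl: the bot follows each page's links at one less depth until the counter reaches zero. The following is only a hypothetical sketch of that idea; the names mCrawl and mGetLinksOnPage are ours, not the download's.

```vb
'Hypothetical sketch only: how a search depth can bound a
'crawl. mGetLinksOnPage is an assumed helper that returns a
'Collection of URL strings found on the page.
Sub mCrawl(ByVal vstrURL As String, ByVal vintDepth As Integer)
  If (vintDepth <= 0) Then Exit Sub
  If mblnAlreadyVisiting(vstrURL) Then Exit Sub
  Dim vLink As Variant
  For Each vLink In mGetLinksOnPage(vstrURL)
    'follow each link at one less depth
    mCrawl CStr(vLink), vintDepth - 1
  Next
End Sub
```

Because each level multiplies the number of pages to visit, even a depth of 9 or 10 can keep the bot busy for many hours.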
Full speed ahead, botty
Now, select the Show In Browser check box. This option lets
you monitor the bot's progress in the right browser window. The other browsing
check box, Stop Each Page, pauses the Web-bot after each page to allow you to
monitor the results. Chances are, if you want to run the bot unattended, you
won't want to use this option.
Finally, tell the Web-bot where to start. Search engines
can be good launching points, so if you want to start with one of these, choose
the corresponding option button. If you want to start at a custom URL, click the
Custom URL option button, and then enter the URL in the text box.
Now that we've set the Web-bot's options, we're ready to
launch it. To do so, click Start Search, and then click Yes when the program
asks if you're conducting a new search. That done, the Web-bot races ahead at
warp speed, looking for the information you requested. (OK, that's the last of
the Star Trek references, promise!)
At any time, if you wish to take a closer look at a URL,
just click the Pause button. Then, find a URL in the treeview and right-click on
it. Doing so transports the page into the browser on the right side. The program
also logs email addresses, as well as the URLs, in a local Access 97 database
for your later perusal. We called this database WebAgent.mdb.
The anatomy of a Web-bot
Now that we've looked at a working Web-bot, let's take a
look at some of the necessary features that you'll need when you create your
own. For space considerations, we won't get into the form's exact design.
However, Figure A should provide a blueprint for your own layout.
In addition to the controls visible at runtime, Figure B
shows the few controls that aren't. As you can see, we've placed an ImageList
and Inet control on the form. Also, the larger box at the very bottom is an
RTFTextbox control. Finally, note that in the main body of the Web-bot, we used
a Treeview to list the Web sites and email addresses, and a Browser control to
display the pages. Now, let's take a look at the more complex features.
Figure B: We'll import HTML pages into the
RTFTextbox control, and then use its Find method to search the HTML for the information we need.
Navigating to a Web page
The program gains its ability to load Web pages
from the Microsoft Internet Transfer Control (MSInet.ocx). To use it, simply drop the
control onto a form and call its OpenURL method. In our Web-bot,
mNavigateToURL accomplishes this task, as well as
provides time-out error trapping and the code to transfer the raw HTML to the
RTFTextbox control for later use. Listing A shows the code for this procedure.
vstrURL contains the URL that the Web-bot is currently visiting.
Listing A: Navigating to a URL
Function mNavigateToURL(ByRef rIntInternetControl _
    As Inet, ByRef rbrwsBrowserControl As WebBrowser, _
    ByRef rrtfTextBox As RichTextBox, ByRef vstrURL _
    As String) As Boolean
  mNavigateToURL = False
  On Error GoTo lblOpenError
  rIntInternetControl.URL = vstrURL
  rIntInternetControl.AccessType = icDirect
  frmWebBot.sbWebBot.Panels(1).Text = "Loading " _
      & vstrURL & "..."
  'transfer the raw HTML for later parsing
  rrtfTextBox.Text = rIntInternetControl.OpenURL
  frmWebBot.sbWebBot.Panels(1).Text = ""
  On Error GoTo 0
  'optionally show the page in the browser pane
  If (frmWebBot.chkShowInBrowser = vbChecked) Then
    rbrwsBrowserControl.Navigate vstrURL
  End If
  mNavigateToURL = True
  Exit Function
lblOpenError:
  Select Case (Err.Number)
    Case Else  'time-out or other load failure
      mNavigateToURL = False
  End Select
End Function
Displaying Web pages
Once the Inet control loads a page, the Web-bot needs to
display it in the right pane of the main control panel. The Microsoft Web
Browser control (found in the Microsoft Internet Controls library,
shdocvw.dll) makes it very easy to do so. The following code causes the
browser to display the current page:
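The snippet that belongs here was lost in reproduction. Judging from Listing A's WebBrowser parameter, it almost certainly resembles this one-liner (the control name is our assumption):

```vb
'display the current page in the right-hand browser pane
'(control name assumed from Listing A's parameter list)
rbrwsBrowserControl.Navigate vstrURL
```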
Analyzing a page
After loading and displaying a page, the Web-bot reads it.
Our particular Web-bot requires two different pieces of information:
- The email addresses located on the page.
- The links that exit the page, so the Web-bot can
continue its journey.
As you'll recall from Listing A, the
Web-bot stores the raw HTML for the page in a Rich Text Box control.
The control's built-in
Find method allows the Web-bot to perform
some rudimentary searching, but the procedure must also parse the HTML document
from a specific starting and ending delimiter, and extract the text that lies in
between. We created the
mExtractHTML function in Listing B to
accomplish this task. If it finds what it's looking for, it returns the HTML
contents. Otherwise, it returns the empty string.
Listing B: The mExtractHTML function
Function mExtractHTML(ByVal vstrStartDelimiter _
    As String, ByVal vstrEndDelimiter As String, _
    ByRef rrtfHtml As RichTextBox, ByRef _
    rrlngPageIndex As Long) As String
  Dim lngStringStart As Long
  Dim lngStringEnd As Long
  mExtractHTML = ""
  On Error GoTo lblError
  If (vstrStartDelimiter <> "") Then
    'find the starting delimiter
    rrlngPageIndex = rrtfHtml.Find(vstrStartDelimiter, _
        rrlngPageIndex + 1)
    If (rrlngPageIndex = -1) Then Exit Function
    lngStringStart = rrlngPageIndex + _
        Len(vstrStartDelimiter)
  Else
    'start at current position
    lngStringStart = rrlngPageIndex
  End If
  'find ending delimiter
  rrlngPageIndex = rrtfHtml.Find(vstrEndDelimiter, _
      lngStringStart + 1)
  If (rrlngPageIndex = -1) Then Exit Function
  lngStringEnd = rrlngPageIndex - 1
  'extract the text between the delimiters
  rrtfHtml.SelStart = lngStringStart
  rrtfHtml.SelLength = lngStringEnd - lngStringStart + 1
  mExtractHTML = rrtfHtml.SelText
  'set output value
  rrlngPageIndex = lngStringEnd + Len(vstrEndDelimiter)
  On Error GoTo 0
  Exit Function
lblError:
  mExtractHTML = ""
End Function
Two companion functions build on mExtractHTML and return the links or email addresses
(respectively) back to the calling routine via a collection. These functions are
smart enough to remove links and email addresses that might appear valid to a
less sophisticated Web-bot, but really wouldn't be applicable. For example, most
email addresses for mailing lists follow predictable patterns, such as
firstname.lastname@example.org, and the routine weeds these out. It also screens
out generic placeholder addresses like email@example.com.
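To give you an idea of how such a routine can lean on mExtractHTML, here's a hypothetical sketch (mGetLinks is our name, not the download's) that collects every href value on the page into a collection:

```vb
'Hypothetical sketch: harvest href values via mExtractHTML
'(Listing B). mGetLinks is our name, not the download's.
Function mGetLinks(ByRef rrtfHtml As RichTextBox) As Collection
  Dim colLinks As New Collection
  Dim lngIndex As Long
  Dim strLink As String
  lngIndex = 0
  Do
    'pull the text between href=" and the closing quote
    strLink = mExtractHTML("href=""", """", rrtfHtml, lngIndex)
    If (strLink = "") Then Exit Do
    colLinks.Add strLink
  Loop
  Set mGetLinks = colLinks
End Function
```

A real version would also resolve relative URLs and apply the screening rules described above before adding each link.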
Avoiding infinite loops
Some pages either link back to themselves or link to other
pages that eventually loop back to the original page. If a Web-bot doesn't keep
an eye out for such pages, it can easily fall into an infinite loop. To avoid
this trap, our Web-bot does two things. First, it uses a logging function
to store every URL in the Access database. As you can see if you view the code
in this month's download, this function uses standard ADO code for saving data
to a database.
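We won't reprint that routine here, but based on the table Listing C queries, a minimal sketch of it might look like the following (the routine name and connect string are our assumptions):

```vb
'Hypothetical sketch of the URL-logging step; the shipping
'code is in this month's download. Connect string assumed.
Sub mLogVisitedURL(ByVal vstrURL As String)
  Dim objConnection As ADODB.Connection
  Set objConnection = New ADODB.Connection
  objConnection.Open "Provider=Microsoft.Jet.OLEDB.4.0;" _
      & "Data Source=WebAgent.mdb"
  'double up embedded quotes, then insert the URL
  objConnection.Execute _
      "INSERT INTO WebBot_Visited_Url (url) VALUES ('" _
      & Replace(vstrURL, "'", "''") & "')"
  objConnection.Close
End Sub
```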
Second, before going to any new URL, it determines if it
already visited the page. To do so, it calls mblnAlreadyVisiting,
shown in Listing C. If the database contains the URL, then the Web-bot skips the
page, thus short-circuiting the infinite loop.
Listing C: Code to detect duplicate URL
Function mblnAlreadyVisiting(ByVal vstrURL _
    As String) As Boolean
  Dim objConnection As ADODB.Connection
  Dim objRecordset As ADODB.Recordset
  Dim strSQL As String
  'connect to database (connect string assumed here;
  'see this month's download for the exact one)
  Set objConnection = New ADODB.Connection
  objConnection.Open "Provider=Microsoft.Jet.OLEDB.4.0;" _
      & "Data Source=WebAgent.mdb"
  strSQL = "SELECT * FROM WebBot_Visited_Url " _
      & "WHERE url='" & vstrURL & "'"
  Set objRecordset = New ADODB.Recordset
  On Error GoTo lblOpenError
  objRecordset.Open strSQL, objConnection, _
      adOpenForwardOnly, adLockReadOnly
  On Error GoTo 0
  'a matching record means we've seen this URL before
  If objRecordset.EOF = False Then
    mblnAlreadyVisiting = True
  Else
    mblnAlreadyVisiting = False
  End If
  objRecordset.Close
  Set objRecordset = Nothing
  objConnection.Close
  Exit Function
lblOpenError:
  mblnAlreadyVisiting = False
End Function
Resuming operation after stopping
Should anything unforeseen happen during a Web-bot search,
such as the operating system crashing or the computer getting switched off, the
search would normally have to be completely rerun. However, this would not be a
happy prospect for someone who was a few hours, or days, into a search, so the
Web-bot code is built to handle this contingency.
To allow the user to resume his search, the Web-bot uses
the same URL log that protects against infinite loops to keep track of the
currently visited URL. If the application gets prematurely shut down, it will
simply pick up where it left off.
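One simple way to implement that pickup is to seed the crawl with the last URL the log recorded. The sketch below rests on our own assumptions (an autonumber column named id in the log table; the routine name is ours):

```vb
'Hypothetical sketch: fetch the most recently logged URL so
'the crawl can resume from it. Assumes an autonumber column
'named id in WebBot_Visited_Url; names are our assumptions.
Function mGetResumeURL() As String
  Dim objConnection As ADODB.Connection
  Dim objRecordset As ADODB.Recordset
  Set objConnection = New ADODB.Connection
  objConnection.Open "Provider=Microsoft.Jet.OLEDB.4.0;" _
      & "Data Source=WebAgent.mdb"
  Set objRecordset = New ADODB.Recordset
  objRecordset.Open "SELECT TOP 1 url " _
      & "FROM WebBot_Visited_Url ORDER BY id DESC", _
      objConnection
  If Not objRecordset.EOF Then
    mGetResumeURL = objRecordset.Fields("url").Value
  End If
  objRecordset.Close
  objConnection.Close
End Function
```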
Web-bots make the Web infinitely more useful because they
allow you to pull in more information than a mere search engine, and allow you
to gather the information into a useful format. The uses for a Web-bot are only
limited by your imagination, and with this article, you now have the tools to
build whatever you can dream up.