WWW Statistics for LAPACK
A new visit is defined as each new incoming visitor (viewing or browsing a page) who was not connected to your site during the last 60 minutes.
Number of client hosts (IP addresses) that came to visit the site (and that viewed at least one page).
This figure approximates the number of different physical persons who reached the site.
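As a rough illustration of the two definitions above (AWStats itself is a Perl program; this Python sketch is not its implementation), the snippet below groups hits into visits using the same 60-minute inactivity rule and counts unique visitors as distinct client hosts. The sample records are invented.

```python
from datetime import datetime, timedelta

# Invented, pre-parsed log records: (client IP, timestamp of the hit).
hits = [
    ("192.0.2.1", datetime(2012, 12, 31, 22, 10)),
    ("192.0.2.1", datetime(2012, 12, 31, 22, 40)),  # gap < 60 mn: same visit
    ("192.0.2.1", datetime(2013, 1, 1, 1, 5)),      # gap > 60 mn: new visit
    ("198.51.100.7", datetime(2012, 12, 31, 23, 0)),
]

SESSION_GAP = timedelta(minutes=60)

visits = 0
last_seen = {}  # last hit time per client host
for ip, ts in sorted(hits, key=lambda h: h[1]):
    if ip not in last_seen or ts - last_seen[ip] > SESSION_GAP:
        visits += 1  # host was not connected during the last 60 mn
    last_seen[ip] = ts

unique_visitors = len({ip for ip, _ in hits})  # distinct client hosts (IP addresses)
print(visits, unique_visitors)  # -> 3 2
```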
Number of times a page of the site was viewed (summed over all visitors and all visits).
This piece of data differs from "hits" in that it counts only HTML pages, as opposed to images and other files.
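A minimal sketch of that pages-versus-hits distinction, assuming a simple extension-based rule; AWStats decides what counts as a page from its own configuration, so the suffix list here is only an assumption of this example.

```python
# Invented sample of requested URLs; every request is a hit, but only HTML
# documents are counted as pages. The suffix list is an assumption of this
# sketch, not AWStats' actual page/non-page configuration.
requested = [
    "/lapack/index.html",
    "/images/logo.gif",
    "/lapack/lug/node1.html",
    "/lapack/lapack-3.4.2.tgz",
]

PAGE_SUFFIXES = (".html", ".htm", "/")  # assumed definition of a "page"

hits = len(requested)
pages = sum(1 for url in requested if url.lower().endswith(PAGE_SUFFIXES))
print(f"hits={hits} pages={pages}")  # -> hits=4 pages=2
```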
Number of times a page, image, or file of the site was viewed or downloaded.
This piece of data is provided as a reference only, since the number of "pages" viewed is often preferred for marketing purposes.
This piece of information refers to the amount of data downloaded for all pages, images, and files within your site.
Units are KB, MB, or GB (KiloBytes, MegaBytes, or GigaBytes).
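For illustration, a small Python helper that expresses a raw byte count in the KB/MB/GB form used by the bandwidth figures in this report; the 1024-based thresholds are an assumption of this sketch, not a statement about AWStats internals.

```python
# Express a raw byte count in Bytes, KB, MB or GB with two decimals, the form
# used in the bandwidth column of this report (1 KB = 1024 bytes assumed).
def format_bandwidth(nbytes: int) -> str:
    for unit, factor in (("GB", 1024 ** 3), ("MB", 1024 ** 2), ("KB", 1024)):
        if nbytes >= factor:
            return f"{nbytes / factor:.2f} {unit}"
    return f"{nbytes} Bytes"

print(format_bandwidth(58_530_000_000))  # -> 54.51 GB
print(format_bandwidth(781))             # -> 781 Bytes
```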
Awstats recognizes each access to your site that follows a search on one of the 228 most popular Internet search engines and directories (such as Yahoo, AltaVista, Lycos, Google, Voila, etc.).
List of external pages that were used to link to (and enter) your site (only the 10 most frequently used external pages are shown). Links from search-engine results are excluded here because they are already counted on the previous line of this table.
This table lists the most frequent keyphrases and keywords used to find your site from Internet search engines and directories. (Keywords from the 228 most popular search engines and directories, such as Yahoo, AltaVista, Lycos, Google, Voila, etc., are recognized by Awstats.)
Note that the total number of keyword searches may be greater than the total number of keyphrase searches (the real number of searches), because when two keywords are used in the same search, that search is counted once for each keyword.
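A worked example of that counting rule, with invented sample queries: three searches yield three keyphrases but six keyword counts.

```python
from collections import Counter

searches = ["lapack dgesv", "blas", "lapack users guide"]  # invented sample queries

keyphrases = Counter(searches)
keywords = Counter(word for phrase in searches for word in phrase.split())

print(sum(keyphrases.values()))  # 3 searches (the real number of searches)
print(sum(keywords.values()))    # 6 keyword counts (one per word per search)
```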
Robots (sometimes referred to as spiders) are automatic visitors used by many search engines; they scan your web site to index and rank it, to collect statistics on Internet web sites, and/or to check whether your site is still online.
Awstats is able to recognize up to 643 robots.
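A minimal sketch of how robot detection by user-agent signature can work; the tiny signature table below is invented for the example and is not AWStats' real robot database (which is what the count above refers to).

```python
# Invented sample signatures; AWStats ships its own, much larger database.
ROBOT_SIGNATURES = {
    "googlebot": "Googlebot",
    "baiduspider": "BaiDuSpider",
    "yandex": "Yandex bot",
    "bot": "Unknown robot (identified by 'bot*')",  # generic fallback pattern
}

def identify_robot(user_agent: str):
    ua = user_agent.lower()
    for signature, name in ROBOT_SIGNATURES.items():  # specific entries first
        if signature in ua:
            return name
    return None  # treated as a normal (human) visitor

print(identify_robot("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # -> Googlebot
print(identify_robot("Mozilla/5.0 (Windows NT 6.1)"))             # -> None
```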
All time-related statistics are based on server time.
Here, the reported data are average values (calculated from all data between the first and last visit in the analyzed range).
Here, the reported data are cumulative sums (calculated from all data between the first and last visit in the analyzed range).
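A small illustration of the difference between the two kinds of figures, using invented daily hit counts for the analyzed range.

```python
daily_hits = [1200, 950, 1430, 1010]  # invented daily hit counts in the range

cumulative_total = sum(daily_hits)                  # cumulative sum: 4590
daily_average = cumulative_total / len(daily_hits)  # average value: 1147.5
print(cumulative_total, daily_average)
```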
Some visit durations are 'unknown' because they cannot always be calculated. The main reasons for this are:
- The visit was not finished when the 'update' occurred.
- The visit started during the last hour (after 23:00) of the last day of a month (a technical limitation prevents Awstats from calculating the duration of such sessions).
Worms are automatic visitors that are in fact external servers, infected by a virus, which make particular hits on your server in an attempt to infect it. In most cases, such worms exploit bugs in out-of-date or commercial servers. If your system is not the kind of target the worm is looking for, you can simply ignore those hits.
There are very few 'server worms' in the world, but they are very active at times. Awstats is able to recognize 0 known worm signatures (Nimda, Code Red, ...).
No description for this error.
Request was understood by the server but will be processed later.
The server has processed the request, but there is no document to send.
Partial content.
The requested document has moved and is now at another address, which is given in the answer.
No description for this error.
Syntax error; the server did not understand the request.
Tried to reach a URL where a login/password pair was required.
A high number for this item could mean that someone (such as a hacker) is attempting to break into your site (for example, hoping to enter a secured area by trying different login/password pairs).
Tried to reach a URL that is not configured to be reachable, even with a login/password pair (for example, a URL within a directory not defined as "browsable").
Tried to reach a non-existent URL. This error often means that there is an invalid link somewhere on your site or that a visitor mistyped a URL.
The server took too long to respond to a request. This error frequently involves either a slow CGI script that the server had to kill or an extremely congested web server.
Internal error. This error is often caused by a CGI program that finished abnormally (a core dump, for example).
Unknown requested action.
Code returned by an HTTP server acting as a proxy or gateway when the real, targeted server does not answer the client's request successfully.
Internal server error.
Gateway Time-out.
HTTP Version Not Supported.
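For illustration only, a short Python sketch (assuming Apache combined log format) of how status codes like those described above can be tallied from an access log; the sample lines are invented.

```python
import re
from collections import Counter

# Invented access-log lines in Apache combined format (an assumption).
LOG_LINES = [
    '192.0.2.1 - - [31/Dec/2012:22:59:01 +0000] "GET /lapack/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '192.0.2.2 - - [31/Dec/2012:23:01:12 +0000] "GET /missing HTTP/1.1" 404 310 "-" "Mozilla/5.0"',
    '192.0.2.3 - - [31/Dec/2012:23:02:40 +0000] "GET /private/ HTTP/1.1" 403 199 "-" "Mozilla/5.0"',
]

STATUS_RE = re.compile(r'" (\d{3}) ')  # the status code follows the quoted request

status_counts = Counter(
    m.group(1) for line in LOG_LINES if (m := STATUS_RE.search(line))
)
print(status_counts)  # -> Counter({'200': 1, '404': 1, '403': 1})
```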
 
Statistics for: www.netlib.org
Last Update: 07 Feb 2013 - 14:59
Reported period: Year 2012
 
107 different robots | Hits | Bandwidth | Last visit
Mail.ru bot | 3,623 | 977.15 MB | 31 Dec 2012 - 22:59
Googlebot | 718,221 | 54.51 GB | 31 Dec 2012 - 22:59
BaiDuSpider | 218,836 | 9.51 GB | 31 Dec 2012 - 22:59
Unknown robot (identified by 'bot*') | 250,960 | 15.19 GB | 31 Dec 2012 - 22:57
Unknown robot (identified by empty user agent string) | 13,071 | 28.32 GB | 31 Dec 2012 - 22:53
Sogou Spider | 19,104 | 492.60 MB | 31 Dec 2012 - 22:53
Unknown robot (identified by 'robot') | 394,432 | 23.98 GB | 31 Dec 2012 - 22:52
Yandex bot | 610,491 | 10.55 GB | 31 Dec 2012 - 22:52
Voila | 53,638 | 1.55 GB | 31 Dec 2012 - 22:38
yacy | 516 | 155.15 MB | 31 Dec 2012 - 22:28
MSNBot-media | 1,651 | 7.17 MB | 31 Dec 2012 - 22:23
Exabot | 5,059 | 181.72 MB | 31 Dec 2012 - 22:07
Unknown robot (identified by '*bot') | 375,058 | 6.29 GB | 31 Dec 2012 - 22:05
Perl tool | 4,798 | 482.58 MB | 31 Dec 2012 - 21:49
WGet tools | 935,415 | 278.67 GB | 31 Dec 2012 - 21:49
Java (Often spam bot) | 7,484 | 8.13 GB | 31 Dec 2012 - 21:11
Unknown robot (identified by 'spider') | 16,839 | 2.77 GB | 31 Dec 2012 - 21:06
WordPress | 3,107 | 904.85 MB | 31 Dec 2012 - 20:30
Python-urllib | 174,224 | 8.31 GB | 31 Dec 2012 - 19:26
Yahoo Slurp | 6,105 | 1.54 GB | 31 Dec 2012 - 16:36
MJ12bot | 18,955 | 397.09 MB | 31 Dec 2012 - 15:14
BSpider | 2,473 | 109.68 MB | 31 Dec 2012 - 14:55
Nutch | 6,362 | 566.10 MB | 31 Dec 2012 - 14:53
SeznamBot | 3,812 | 247.06 MB | 31 Dec 2012 - 14:17
Heritrix | 6,185 | 1.54 GB | 31 Dec 2012 - 13:11
Unknown robot (identified by 'crawl') | 68,081 | 3.88 GB | 31 Dec 2012 - 11:53
CFNetwork | 400 | 1.02 GB | 31 Dec 2012 - 10:46
Alexa (IA Archiver) | 1,238 | 169.21 MB | 31 Dec 2012 - 10:43
LinkChecker | 530 | 7.02 MB | 31 Dec 2012 - 09:56
Checkbot | 245 | 0 | 31 Dec 2012 - 06:47
Phantom | 65 | 1.87 MB | 31 Dec 2012 - 01:25
MSNBot | 8,627 | 170.89 MB | 30 Dec 2012 - 14:04
Google AdSense | 213 | 880.12 MB | 30 Dec 2012 - 08:09
ichiro | 6,367 | 1.01 GB | 30 Dec 2012 - 06:11
psbot | 7,964 | 11.36 MB | 30 Dec 2012 - 04:20
LinkScan | 76 | 1.15 MB | 30 Dec 2012 - 01:28
legs | 47,959 | 8.88 GB | 29 Dec 2012 - 07:26
Unknown robot (identified by 'checker') | 441 | 87.14 MB | 29 Dec 2012 - 04:03
Web Core / Roots | 14 | 453.29 KB | 28 Dec 2012 - 15:14
Feedfetcher-Google | 352 | 19.66 MB | 28 Dec 2012 - 11:30
WWWC Ver 0.2.5 | 92 | 0 | 28 Dec 2012 - 08:27
Y!J Yahoo Japan | 804 | 128.93 MB | 28 Dec 2012 - 02:13
NG 1.x (Exalead) | 1,766 | 158.21 MB | 27 Dec 2012 - 21:20
BBot | 61 | 7.02 MB | 27 Dec 2012 - 20:08
Jakarta commons-httpclient | 759 | 41.04 MB | 27 Dec 2012 - 04:18
larbin | 71 | 2.83 MB | 26 Dec 2012 - 12:14
StackRambler | 212 | 47.20 MB | 26 Dec 2012 - 08:32
WebFilter | 89 | 6.72 MB | 25 Dec 2012 - 23:59
Netluchs | 43 | 2.67 MB | 24 Dec 2012 - 07:05
WebCollage | 6 | 167.25 KB | 24 Dec 2012 - 03:10
archive.org bot | 8,852 | 2.28 GB | 21 Dec 2012 - 23:29
HTTrack off-line browser | 256,047 | 8.40 GB | 19 Dec 2012 - 17:25
FaceBook bot | 892 | 10.68 MB | 19 Dec 2012 - 12:03
MojeekBot | 18 | 3.41 MB | 19 Dec 2012 - 09:34
W3C Link Checker | 30 | 150.80 KB | 17 Dec 2012 - 06:56
WebVac | 34,998 | 172.17 MB | 13 Dec 2012 - 19:13
Ultraseek | 480 | 175.38 MB | 12 Dec 2012 - 22:02
Steeler | 12 | 848.27 KB | 09 Dec 2012 - 15:50
LinkBot | 11 | 6.14 MB | 06 Dec 2012 - 00:49
Unknown robot (identified by 'discovery') | 64,641 | 8.02 GB | 04 Dec 2012 - 09:53
Holmes | 2 | 81.40 KB | 02 Dec 2012 - 14:25
NG 2.x (Exalead) | 7 | 163.56 KB | 02 Dec 2012 - 11:11
Findlinks | 44 | 1.47 MB | 01 Dec 2012 - 23:16
JBot Java Web Robot | 4 | 320.13 KB | 24 Nov 2012 - 04:12
Sven | 3 | 66.30 KB | 20 Nov 2012 - 03:06
HTMLParser | 23 | 1.25 MB | 17 Nov 2012 - 06:20
Ocelli | 135,352 | 5.65 GB | 14 Nov 2012 - 21:34
Fish search | 8 | 178.02 KB | 11 Nov 2012 - 16:25
Unknown robot (identified by 'sucker') | 71,205 | 1.72 GB | 11 Nov 2012 - 10:02
Turn It In | 118 | 4.30 MB | 08 Nov 2012 - 19:28
LinkWalker | 4 | 6.08 MB | 02 Nov 2012 - 15:17
Webdup | 1 | 41.80 KB | 27 Oct 2012 - 00:05
WallPaper (alias crawlpaper) | 1 | 18.27 KB | 10 Oct 2012 - 01:44
Powermarks | 16 | 3.00 MB | 08 Oct 2012 - 10:28
mnoGoSearch search engine software | 1 | 41.24 KB | 14 Aug 2012 - 21:14
InfoBot | 27 | 1.02 MB | 13 Aug 2012 - 15:56
MSIECrawler | 10 | 177.68 KB | 09 Aug 2012 - 05:03
Feedburner | 1 | 0 | 08 Aug 2012 - 01:19
bender focused_crawler | 501 | 7.97 MB | 02 Aug 2012 - 12:42
Custo | 9 | 8.65 MB | 29 Jul 2012 - 00:07
OutfoxBot/YodaoBot | 13,069 | 1.30 MB | 25 Jul 2012 - 20:45
W3C Validator | 22 | 443.28 KB | 22 Jul 2012 - 10:12
FeedValidator | 11 | 163.46 KB | 22 Jul 2012 - 10:12
Spiderman | 7 | 188.37 KB | 20 Jul 2012 - 14:56
The World Wide Web Worm | 1 | 86.82 KB | 17 Jul 2012 - 12:19
QihooBot | 4 | 178.59 KB | 12 Jul 2012 - 06:11
Emacs-w3 Search Engine | 58 | 6.93 MB | 07 Jun 2012 - 21:23
Unknown robot (identified by 'hunter') | 1 | 16.66 KB | 02 Jun 2012 - 04:32
Pogodak.com | 1 | 5.83 KB | 17 May 2012 - 01:54
Perman surfer | 3 | 43.84 KB | 16 May 2012 - 02:49
Aport | 1 | 0 | 06 May 2012 - 18:44
Speedy Spider | 1,649 | 71.94 MB | 06 May 2012 - 01:43
Internet Shinchakubin | 1 | 40.27 KB | 01 May 2012 - 11:25
Pioneer | 1 | 781 Bytes | 30 Apr 2012 - 17:16
MagpieRSS | 1 | 16.66 KB | 29 Apr 2012 - 07:44
Cursor | 1 | 359 Bytes | 28 Apr 2012 - 18:20
Raven Search | 119 | 1.90 MB | 24 Apr 2012 - 22:01
churl | 4 | 39.60 KB | 13 Apr 2012 - 08:34
TITAN | 3 | 64.10 KB | 11 Apr 2012 - 18:01
WebReaper | 1,875 | 4.10 MB | 02 Apr 2012 - 08:26
Seekbot | 1 | 39.60 KB | 20 Mar 2012 - 01:14
JoBo Java Web Robot | 2 | 189.91 KB | 07 Mar 2012 - 00:28
Susie | 1 | 16.66 KB | 26 Feb 2012 - 14:39
marvin/infoseek | 15 | 6.06 MB | 25 Feb 2012 - 10:02
T-H-U-N-D-E-R-S-T-O-N-E | 4 | 157.71 KB | 25 Feb 2012 - 02:27
GigaBot | 5 | 225.58 KB | 13 Feb 2012 - 09:56
Ask | 1 | 12.27 KB | 07 Jan 2012 - 16:23
* The robots shown here generated hits or traffic "not viewed" by visitors, so they are not included in the other charts.



Advanced Web Statistics 7.0 (build 1.971) - Created by awstats (plugins: hashfiles, decodeutfkeys, tooltips)