Compare commits


No commits in common. "master" and "v6.5" have entirely different histories.
master ... v6.5

44 changed files with 1119 additions and 2430 deletions

ChangeLog

@ -1,131 +1,5 @@
Revision history for SquidAnalyzer
szaszg
* SquidAnalyzer.pm can read a compressed 'stat_code.dat', so you can compress
previous years' .dat files (e.g. find 2023 -iname '*.dat' | xargs xz -9).
* You can install SquidAnalyzer.pm into a custom LIB directory, so you can
keep all files together (e.g. /opt/squidanalyzer).
* Hungarian translation added (hu_HU.txt).
6.6 - Sun May 7 16:38:14 CEST 2017
This is a maintenance release that fixes one year of issues reported by users.
There are also some additional features and configuration directives, all
listed here:
* Add TopStorage configuration directive to limit the number of URLs stored
in data files, sorted by OrderUrl. On huge access logs this improves
performance a lot, but you will have less precision in the top URLs.
Defaults to 0: all URLs are stored.
Here is the performance of SquidAnalyzer parsing a 1.4 GB access log
file and computing full reports over one week:
UrlReport | UserReport | Duration
----------+------------+---------
0 | 0 | 2m30s
0 | 1 | 3m00s
1 | 1 | 18m15s
1 | 1 | 9m55s when TopStorage is set to 100
* Add a cache to network and user aliases for speed improvement. Thanks to
Louis-Berthier Soulliere for the report.
* Add TimeStart and TimeStop configuration directives to allow specifying
a start and stop time. Log lines outside this time range will not be
parsed. The format of the value is HH:MM. These directives can be
overridden with the -s | --start and -S | --stop command line options.
Thanks to Louis-Berthier Soulliere for the feature request.
* Add UpdateAlias configuration directive to immediately apply the changes
made in aliases files, to avoid duplicates. You still have to use
--rebuild to recreate previous reports with new aliases. Enabling
this implies a loss of performance with huge log files.
* Add UseUrlPort configuration directive to be able to include the port
number in URL statistics. Default is to remove the port information from
the URL. Thanks to Tobias Wigand for the feature request.
* Add report of top denied URLs on the user statistics page. Thanks to
delumerlino and Pavel Podkorytov for the feature request.
* Add last visited timestamp on URL reports and show the last ten visits
on the user URL report. Last visits are counted after 5 minutes in the
hour view, after 30 minutes in day views and per day in the month view.
Thanks to Ringa Mari Sundberg for the feature request.
* Add support for IPv6 address DNS resolving; you need Perl > 5.014.
Thanks to Brian J. Murrell for the report.
Full list of other bug fixes:
- Change user top url title from "Top n/N Url" to "Top n/N sites". Thanks
to Daniel Bareiro for the report.
- Update documentation to clarify the use of the space character in aliases
files. Thanks to Darren Spruell for the report.
- Fix explanation of UserAlias file format about ip address vs DNS name.
Thanks to Darren Spruell for the report.
- Fix missing report of TCP_DENIED_REPLY messages. Thanks to Jeff Gebhardt
for the report.
- Add license file for the resource files and a script to retrieve the
original javascript libraries.
- Fix html report building that was limited to the last day.
- Fix missing network alias replacement.
- Update year in copyrights.
- Disabled bandwidth cost report by default.
- Fix removing of obsolete year directory.
- Fix obsolete statistics no longer being deleted. Thanks to andreybrasil
for the report.
- Allow parsing of access.log generated through syslog. Thanks to Celine
Labrude for the report.
- Add Url_Hit label in translation files.
- Fix remaining _SPC_ in username. Thanks to roshanroche for the report.
- Fix remaining SA_CALENDAR_SA in html output. Thanks to roshanroche for
the report.
- Add more fix to denied stat datafile corruption. Thanks to PiK2K for the
report.
- Fix denied stat datafile corruption. Thanks to PiK2K for the report.
- Use CORE::localtime to format denied first and last hit.
- Fix potential unparsed log case when log files are set in the
configuration file and not on the command line.
- Change the in-line popup (on top domain and top URL) to show hits on hits
tables, bytes on the bytes tables and duration on the duration tables,
instead of count. Thanks to Wesley Bresson for the feature request.
- Only apply OrderUrl to user url list, other reports in Top domain and Top
Url are now always ordered following the first column, which is the sorted
column of the report (hits, bytes and duration).
- Fix missing limit of the total number of URLs shown for a user to
TopNumber. Thanks to Graham Wing for the report.
- Update statistics on users with DENIED code to have the full list of
user/ip even if they never hit a url.
- Change Perl install directory from vendor to site to avoid a well known
issue on BSD. Thanks to dspruell for the report.
- Add initial Debian package build files.
- Update squidanalyzer.css to change the width of the single menu tabs:
in German, the "TOP DENIED" tab is labelled "TOP VERBOTEN", which now
displays without word wrap. Thanks to Klaus Tachtler for the patch.
- Fix Throughput label for unit/s that was not dynamically changed during
value formatting and was always labelled as B/s. Thanks to aabaker for the
report.
- Fix typo in graph titles. Thanks to aabaker for the patch.
- Update missing fields to German language file. Thanks to Klaus Tachtler
for the patch.
- Fix top url report that no longer cumulated statistics. Thanks to
Wesley Bresson for the report.
- Fix typo about Network exclusion. Thanks to Mathieu Parent for the patch.
- Manpages fixes. Thanks to Mathieu Parent for the patch.
- Use FHS for manpages path. Thanks to Mathieu Parent for the patch.
- Update russian language file. Thanks to Yuri Voinov for the patch.
- Fix typo in mime type redefinition.
- Mark mime-types with invalid characters as "invalid/type". Thanks to
gitdevmod for the report.
- Add missing throughput translation entries in lang files. Thanks to Yuri
Voinov for the report.
- Fix major issue in SquidGuard and ufdbGuard history file management.
Thanks to Guttilla Elmi for the report and the help.
- Fix path to xzcat program during install. Thanks to Johan Glenac for
the report.
- Fix auto detection of SquidGuard log file when there is no denied entry
in the first lines.
- Fix typo in debug messages.
- Add warning when DNSLookupTimeout is reached. Thanks to gitdevmod for
the report.
6.5 - Sun Jan 3 16:12:12 CET 2016
This is a maintenance release to fix an overlapping bug on bytes charts with
@ -138,7 +12,7 @@ last versions of browsers like firefox, iceweasel and chrome.
6.4 - Wed Dec 16 22:12:45 CET 2015
This release adds throughput statistics to all reports. It also allows adding
a ufdbGuard log to the list of log files and reporting blocked URLs in the
Denied reports. It also adds support for xz compressed files.
@ -185,7 +59,7 @@ It also included several bug fixes since last release.
6.3 - Mon Oct 12 07:56:29 CEST 2015
This release adds a new report to show statistics about Denied URLs. It also
allows adding a SquidGuard log to the list of log files and reporting blocked
URLs in the Denied reports. It also adds a pie chart on SquidGuard ACL use.
There are also four new configuration directives:
@ -386,7 +260,7 @@ Here the full list of changes:
- Little fix in a translation. Thanks to atlhon for the patch.
- Fix case where days in calendar do not appear when DateFormat was
changed. Thanks to joseh-henrique for the report.
- Update Makefile with META_MERGE and MAN3PODS information.
- Fix missing cleaning of pid file when early error occurs.
- Automatically remove \r when reading configuration file.
- Improve incremental mode by seeking directly to last position in
@ -542,7 +416,7 @@ of users to show in reports.
a custom one is specified with the -c option. Thanks to Thibaud Aubert
for the report.
- Add --no-year-stat to disable year statistics, reports will start from
month level only. This allows saving time during report generation.
- Allow composed top level domain statistics in Top Domain report, like
co.uk. Thanks to Thibaut Aubert for the feature request.
- Add support to CIDR notation in network-alias file. Thanks to Thibaud
@ -771,7 +645,7 @@ dir if required.
only log format supported.
UPGRADE: If you use network and/or user aliases, even though I try to preserve
backward compatibility, you may want to start with new data files, as this
information is now replaced directly in the data file instead of the HTML
files. Changes only concern file SquidAnalyzer.pm so you can just override
it. There's also a new configuration directive 'AnonymizeLogin', so you may
copy/paste its definition in

INSTALL

@ -8,7 +8,7 @@ REQUIREMENT
INSTALLATION
Generic install
If you want the package to be installed into the Perl distribution just
do the following:
perl Makefile.PL
@ -40,9 +40,6 @@ INSTALLATION
as the issue is related to an install into the default Perl vendor
installdirs it will then use Perl site installdirs.
Note: you may no longer encounter this issue since v6.6, as SquidAnalyzer
uses site as the default installation directory.
Custom install
You can create your fully customized SquidAnalyzer installation by using
the Makefile.PL Perl script. Here is a sample:
@ -53,7 +50,7 @@ INSTALLATION
CONFDIR=/etc \
HTMLDIR=/var/www/squidreport \
BASEURL=/squidreport \
MANDIR=/usr/share/man/man3 \
DOCDIR=/usr/share/doc/squidanalyzer
If you want to build a distro package, there are two other options that
@ -105,104 +102,40 @@ INSTALLATION
4. Setup a cronjob to run squid-analyzer daily or more often:
# SquidAnalyzer log reporting daily
0 2 * * * /usr/local/bin/squid-analyzer > /dev/null 2>&1
or run it manually. For more information, see README file.
If your squid logfiles are rotated then cron isn't going to give the
expected result, as there exists a time between when the cron is run and
when the logfiles are rotated. It would be better to call squid-analyzer
from logrotate, e.g.:
/var/log/proxy/squid-access.log {
daily
compress
rotate 730
missingok
nocreate
sharedscripts
postrotate
test ! -e /var/run/squid.pid || /usr/sbin/squid -k rotate
/usr/bin/squid-analyzer -d -l /var/log/proxy/squid-access.log.1
endscript
}
You can also use network names instead of network IP addresses by using
the network-aliases file. Also, if you don't have authentication enabled
and want to replace client IP addresses by some known user or computer
names, you can use the user-aliases file to do so.
See the file squidanalyzer.conf to customize your output statistics to
match your network and file system configuration.
Upgrade
Upgrading to a new release or to the latest development code is just
like installation. To install the latest development code and use the
latest enhancements, proceed as follows:
wget https://github.com/darold/squidanalyzer/archive/master.zip
unzip master.zip
cd squidanalyzer-master/
perl Makefile.PL
make
sudo make install
Then, to apply the changes to current reports, you have to rebuild them
using:
squid-analyzer --rebuild
This command will rebuild all reports for which data files still exist,
i.e. those not removed by the retention limit. It can take a very long
time if you have a lot of history; in this case you may want to use the
-b or --build_date option to limit the rebuild period.
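For example, to limit the rebuild to a single month, a minimal sketch
(the date below is only illustrative):
squid-analyzer --rebuild --build_date 2016-10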
USAGE
SquidAnalyzer can be run manually or by cron job using the
squid-analyzer Perl script. Here is the authorized usage:
Usage: squid-analyzer [ -c squidanalyzer.conf ] [logfile(s)]
-c | --configfile filename : path to the SquidAnalyzer configuration file.
By default: /etc/squidanalyzer/squidanalyzer.conf
-b | --build_date date : set the date to be rebuilt, format: yyyy-mm-dd
or yyyy-mm or yyyy. Used with -r or --rebuild.
-d | --debug : show debug information.
-h | --help : show this message and exit.
-j | --jobs number : number of jobs to run at same time. Default
is 1, run as single process.
-o | --outputdir name : set output directory. If it does not start
with / then prefixes Output from configfile
-p | --preserve number : used to set the statistic obsolescence in
number of month. Older stats will be removed.
-P | --pid_dir directory : set directory where pid file will be stored.
Default /tmp/
-r | --rebuild : use this option to rebuild all html and graphs
output from all data files.
-s | --start HH:MM : log lines before this time will not be parsed.
-S | --stop HH:MM : log lines after this time will not be parsed.
-t | --timezone +/-HH : set number of hours from GMT of the timezone.
Use this to adjust date/time of SquidAnalyzer
output when it is run on a different timezone
than the squid server.
-v | --version : show version and exit.
--no-year-stat : disable years statistics, reports will start
from month level only.
--no-week-stat : disable weekly statistics.
--with-month-stat : enable month stats when --no-year-stat is used.
--startdate YYYYMMDDHHMMSS : lines before this datetime will not be parsed.
--stopdate YYYYMMDDHHMMSS : lines after this datetime will not be parsed.
--skip-history : used to not take care of the history file. Log
parsing offset will start at 0 but old history
file will be preserved at end. Useful if you
want to parse an old log file.
--override-history : when skip-history is used the current history
file will be overridden by the offset of the
last log file parsed.
Log files to parse can be given as command line arguments or as a comma
separated list of files for the LogFile configuration directive. By
default SquidAnalyzer will use file: /var/log/squid/access.log
There are special options like --rebuild that force SquidAnalyzer to
rebuild all HTML reports, useful after a new feature or a bug fix. If
@ -220,15 +153,6 @@ USAGE
will only preserve six months of statistics from the last run of
squidanalyzer.
If you have a SquidGuard log you can add it to the list of files to be
parsed, either in the LogFile configuration directive log list or on
the command line:
squid-analyzer /var/log/squid3/access.log /var/log/squid/SquidGuard.log
SquidAnalyzer will automatically detect the log format and report
SquidGuard ACL redirections in the Denied Urls report.
CONFIGURATION
See README file.
@ -236,7 +160,7 @@ AUTHOR
Gilles DAROLD <gilles@darold.net>
COPYRIGHT
Copyright (c) 2001-2019 Gilles DAROLD
This package is free software and published under the GPL v3 or above
license.

MANIFEST

@ -27,5 +27,4 @@ lang/es_ES.txt
lang/ru_RU.txt
lang/uk_UA.txt
lang/cs_CZ.txt
lang/hu_HU.txt
META.yml Module meta-data (added by MakeMaker)

Makefile.PL

@ -4,7 +4,7 @@ use ExtUtils::MakeMaker;
use strict;
my @ALLOWED_ARGS = ('LOGFILE','BINDIR','ETCDIR', 'CONFDIR','HTMLDIR','BASEURL','DOCDIR','MANDIR','QUIET','INSTALLDIRS','DESTDIR', 'LIB');
# Parse command line arguments and store them as environment variables
while ($_ = shift) {
@ -31,8 +31,7 @@ my $BASEURL = $ENV{BASEURL} || '/squidreport';
my $DOCDIR = $ENV{DOCDIR} || '';
my $MANDIR = $ENV{MANDIR} || '/usr/local/man/man3';
my $DESTDIR = $ENV{DESTDIR} || '';
my $INSTALLDIRS = $ENV{INSTALLDIRS} ||= 'site';
my $LIB = $ENV{LIB} || '';
unless(open(INST, ">install_all.sh")) {
print "\nError: can't write post install file install_all.sh, $!\n";
@ -56,7 +55,7 @@ print INST qq{
test ! -d "$DESTDIR$MANDIR" && mkdir -p $DESTDIR$MANDIR
# Copy files that must not be overridden
for file in squidanalyzer.conf network-aliases user-aliases url-aliases excluded included; do
if [ -r $DESTDIR$ETCDIR/\$file ]; then
install -m 644 etc/\$file $DESTDIR$ETCDIR/\$file.sample
else
@ -131,8 +130,6 @@ close(INST);
`perl -p -i -e 's#^Exclude.*#Exclude $ETCDIR/excluded#' etc/squidanalyzer.conf`;
`perl -p -i -e 's#^Include.*#Include $ETCDIR/included#' etc/squidanalyzer.conf`;
`perl -p -i -e 's#Lang.*\.txt#Lang $ETCDIR/lang/en_US.txt#' etc/squidanalyzer.conf`;
`perl -p -i -e 's|^use lib .*|#use lib "PERL5LIB"|' squid-analyzer`;
`perl -p -i -e 's|^\#use lib .*|use lib "$LIB"|' squid-analyzer` if $LIB ne '';
my $zcat = `which zcat`;
chomp($zcat);
@ -142,11 +139,6 @@ my $bzcat = `which bzcat`;
chomp($bzcat);
`perl -p -i -e 's#^\\\$BZCAT_PROG.*#\\\$BZCAT_PROG = "$bzcat";#' SquidAnalyzer.pm`;
my $xzcat = `which xzcat`;
chomp($xzcat);
`perl -p -i -e 's#^\\\$XZCAT_PROG.*#\\\$XZCAT_PROG = "$xzcat";#' SquidAnalyzer.pm`;
WriteMakefile(
'DISTNAME' => 'SquidAnalyzer',
'NAME' => 'SquidAnalyzer',
@ -158,10 +150,9 @@ WriteMakefile(
'AUTHOR' => 'Gilles Darold (gilles@darold.net)',
'ABSTRACT' => 'Squid log analyzer',
'EXE_FILES' => [ qw(squid-analyzer) ],
'MAN3PODS' => { 'doc/SquidAnalyzer.pod' => 'blib/man3/SquidAnalyzer.3pm' },
'DESTDIR' => $DESTDIR,
'INSTALLDIRS' => $INSTALLDIRS,
'LIB' => $LIB,
'clean' => { FILES => "install_all.sh lib/blib/ squid-analyzer.3" },
'META_MERGE' => {
resources => {

README

@ -20,16 +20,9 @@ REQUIREMENT
are based on the Flotr2 Javascript library so they are drawn at your
browser side without extra installation required.
CHANGES from https://github.com/darold/squidanalyzer
SquidAnalyzer.pm can read a compressed 'stat_code.dat', so you can compress
previous years' .dat files (e.g. find 2023 -iname '*.dat' | xargs xz -9).
You can install SquidAnalyzer.pm into a custom LIB directory, so you can
keep all files together (e.g. /opt/squidanalyzer).
Hungarian translation added.
INSTALLATION
Generic install
If you want the package to be installed into the Perl distribution just
do the following:
perl Makefile.PL
@ -61,9 +54,6 @@ INSTALLATION
as the issue is related to an install into the default Perl vendor
installdirs it will then use Perl site installdirs.
Note: you may no longer encounter this issue since v6.6, as SquidAnalyzer
uses site as the default installation directory.
Custom install
You can create your fully customized SquidAnalyzer installation by using
the Makefile.PL Perl script. Here is a sample:
@ -74,28 +64,16 @@ INSTALLATION
CONFDIR=/etc \
HTMLDIR=/var/www/squidreport \
BASEURL=/squidreport \
MANDIR=/usr/share/man/man3 \
DOCDIR=/usr/share/doc/squidanalyzer
Or you can install everything into one directory (e.g. /opt/squidanalyzer):
perl Makefile.PL \
LOGFILE=/var/log/squid/access.log \
BINDIR=/opt/squidanalyzer/bin \
CONFDIR=/opt/squidanalyzer/etc \
HTMLDIR=/var/www/squidreport \
BASEURL=/squidreport \
MANDIR=/opt/squidanalyzer/share/man/man3 \
DOCDIR=/opt/squidanalyzer/share/doc/squidanalyzer \
LIB=/opt/squidanalyzer/lib
If you want to build a distro package, there are two other options that
you may use. The QUIET option is to tell Makefile.PL not to show the
default post install README. The DESTDIR is to create and install all
files in a package build base directory. For example, for a Fedora RPM,
things may look like this:
# Make Perl and SquidAnalyzer distrib files
%{__perl} Makefile.PL \
INSTALLDIRS=vendor \
QUIET=1 \
@ -146,32 +124,21 @@ INSTALLATION
If your squid logfiles are rotated then cron isn't going to give the
expected result, as there exists a time between when the cron is run and
when the logfiles are rotated. It would be better to call squid-analyzer
from logrotate; create file /etc/logrotate.d/squid with the following
content:
/var/log/squid/*.log {
daily
compress
delaycompress
rotate 5
missingok
nocreate
sharedscripts
postrotate
test ! -e /var/run/squid.pid || test ! -x /usr/sbin/squid || /usr/sbin/squid -k rotate
/usr/local/bin/squid-analyzer -d -l /var/log/squid/access.log.1
endscript
}
Be sure that the paths used in this script correspond to your system.
5. Adjust the configuration
Make sure that the LogFile path is correct in your squidanalyzer.conf
file. For instance:
LogFile /var/log/squid/access.log
You can also use network names instead of network IP addresses by using
the network-aliases file. Also, if you don't have authentication enabled
and want to replace client IP addresses by some known user or computer
@ -180,27 +147,6 @@ INSTALLATION
See the file squidanalyzer.conf to customize your output statistics to
match your network and file system configuration.
Upgrade
Upgrading to a new release or to the latest development code is just
like installation. To install the latest development code and use the
latest enhancements, proceed as follows:
wget https://github.com/darold/squidanalyzer/archive/master.zip
unzip master.zip
cd squidanalyzer-master/
perl Makefile.PL
make
sudo make install
Then, to apply the changes to current reports, you have to rebuild them
using:
squid-analyzer --rebuild
This command will rebuild all reports for which data files still exist,
i.e. those not removed by the retention limit. It can take a very long
time if you have a lot of history; in this case you may want to use the
-b or --build_date option to limit the rebuild period.
USAGE
SquidAnalyzer can be run manually or by cron job using the
squid-analyzer Perl script. Here is the authorized usage:
@ -211,23 +157,17 @@ USAGE
By default: /etc/squidanalyzer/squidanalyzer.conf
-b | --build_date date : set the date to be rebuilt, format: yyyy-mm-dd
or yyyy-mm or yyyy. Used with -r or --rebuild.
-d | --debug : show debug information.
-h | --help : show this message and exit.
-j | --jobs number : number of jobs to run at same time. Default
is 1, run as single process.
-o | --outputdir name : set output directory. If it does not start
with / then prefixes Output from configfile
-p | --preserve number : used to set the statistic obsolescence in
number of month. Older stats will be removed.
-P | --pid_dir directory : set directory where pid file will be stored.
Default /tmp/
-r | --rebuild : use this option to rebuild all html and graphs
output from all data files.
-R | --refresh minutes : add a html refresh tag into index.html file
with a refresh interval in minutes.
-s | --start HH:MM : log lines before this time will not be parsed.
-S | --stop HH:MM : log lines after this time will not be parsed.
-t | --timezone +/-HH : set number of hours from GMT of the timezone.
Use this to adjust date/time of SquidAnalyzer
output when it is run on a different timezone
than the squid server.
@ -235,16 +175,6 @@ USAGE
--no-year-stat : disable years statistics, reports will start
from month level only.
--no-week-stat : disable weekly statistics.
--with-month-stat : enable month stats when --no-year-stat is used.
--startdate YYYYMMDDHHMMSS : lines before this datetime will not be parsed.
--stopdate YYYYMMDDHHMMSS : lines after this datetime will not be parsed.
--skip-history : used to not take care of the history file. Log
parsing offset will start at 0 but old history
file will be preserved at end. Useful if you
want to parse an old log file.
--override-history : when skip-history is used the current history
file will be overridden by the offset of the
last log file parsed.
Log files to parse can be given as command line arguments or as a comma
separated list of files for the LogFile configuration directive. By
@ -338,9 +268,9 @@ CONFIGURATION
LogFile squid_access_log_file
Set the path to the Squid log file. This can be a comma separated
list of files to process several files at the same time. If the
files come from different Squid servers, they will be merged in a
single report. You can also add to the list a SquidGuard log file;
SquidAnalyzer will automatically detect the format.
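For example, a minimal sketch mixing a Squid access log and a SquidGuard
log (the paths below are only illustrative):
LogFile /var/log/squid/access.log,/var/log/squid/SquidGuard.log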
UseClientDNSName 0
If you want to use DNS name instead of client Ip address as username
@ -349,15 +279,11 @@ CONFIGURATION
DNS name instead. Note that you must have a working DNS resolution
and that it can really slow down the generation of reports.
DNSLookupTimeout 100
If you have enabled UseClientDNSName and have a lot of ip addresses
that do not resolve you may want to increase the DNS lookup timeout.
By default SquidAnalyzer will stop looking up a DNS name after 100
ms. The value must be set in milliseconds.
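For example, to double the default lookup timeout (an illustrative
value):
DNSLookupTimeout 200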
StoreUserIp 0
Store and show the different ip addresses used by a user over time in
user statistics. Default: no extra storage.
NetworkAlias network-aliases_file
Set the path to the file containing network alias names. Networks are shown
@ -366,8 +292,7 @@ CONFIGURATION
LOCATION_NAME IP_NETWORK_ADDRESS
Separator must be a tabulation; this allows the use of the space character
in the network alias name.
You can use regex to match and group some network addresses. See
network-aliases file for examples.
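A minimal sketch with illustrative names and networks (the separator
must be a tabulation):
Head Office	192.168.1.0/24
Datacenter	10.10.0.0/16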
@ -380,29 +305,18 @@ CONFIGURATION
FULL_USERNAME IP_ADDRESS
When 'UseClientDNSName' is enabled you can replace the ip address by a
DNS name.
If you have auth_proxy enabled but want to replace the login name by a full
user name for example, create a file with this format:
FULL_USERNAME LOGIN_NAME
Separator for both must be a tabulation; this allows the use of the space
character in the user alias name.
You can use regex to match and group some user login or ip
addresses. See user-aliases file for examples.
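A minimal sketch with illustrative names, addresses and logins (the
separator must be a tabulation):
John Doe	192.168.1.42
Jane Roe	jroe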
UrlAlias url-aliases_file
Set the path to the file containing url alias names. You may want to
group URLs under a single alias to aggregate statistics; in this case
create a file with this format:
URL_ALIAS URL_REGEXP1,URL_REGEXP2,...
Separator must be a tabulation. See network-aliases file for
examples.
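A minimal sketch with an illustrative alias and regexps (the separator
must be a tabulation):
SearchEngines	www\.google\..*,duckduckgo\.com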
AnonymizeLogin 0
Set this to 1 if you want to anonymize all user logins. The username
@ -424,7 +338,7 @@ CONFIGURATION
UrlReport 0|1
Should SquidAnalyzer display user url details. This will show all
URLs read by a user. Take care to have enough disk space for large
users. Default is 0, no url detail report.
UserReport 0|1
@ -444,29 +358,19 @@ CONFIGURATION
Default is 0, verbose mode.
CostPrice price/Mb
Used to set a cost of the bandwidth per Mb. If you want to generate
invoices per Mb of bandwidth traffic this can help you. Value 0 means
no cost; this is the default value, and the "Cost" column is not
displayed.
Currency currency_abbreviation
Used to set the currency of the bandwidth cost. Preferably the html
special character. Default is &euro;
TopNumber number
Used to set the number of top url and second level domain to show.
Default is top 100.
TopDenied number
Used to set the number of top denied urls to show. Default is top
100.
TopStorage number
Top number of urls to preserve in each data file, sorted by OrderUrl.
On huge access logs it will improve performance a lot but you
will have less precision in the top urls. Default is 0: all urls will
be stored.
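For example, a sketch combining these directives (all values below are
only illustrative):
TopNumber 100
TopDenied 50
TopStorage 500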
TopUrlUser Use this directive to show the top N users that look at an
URL or a domain. Set it to 0 to disable this feature. Default is top 10.
Exclude exclusion_file
@ -480,7 +384,7 @@ CONFIGURATION
You can also use the NETWORK type to define network address with
netmask using the CIDR notation: xxx.xxx.xxx.xxx/n
See example below:
NETWORK 192.168.1.0/24 10.10.0.0/16
CLIENT 192\.168\.1\.2
@ -504,7 +408,7 @@ CONFIGURATION
You can also use the NETWORK type to define network address with
netmask using the CIDR notation: xxx.xxx.xxx.xxx/n
See example below:
NETWORK 192.168.1.0/24 10.10.0.0/16
CLIENT 192\.168\.1\.2
@ -551,7 +455,7 @@ CONFIGURATION
report peer cache hit onto your stats.
TransfertUnit
Allow one to change the default unit used to display transfer size.
Default is BYTES, other possible values are KB, MB and GB.
MinPie
@ -569,58 +473,25 @@ CONFIGURATION
MaxFormatError
When SquidAnalyzer finds a corrupted line in its data file, it exits
immediately. You can force it to wait for a certain number of
errors before exiting. Of course you might want to remove the
corrupted line before the next run. This can be useful if you have
special characters in some fields like mime type.
TimeZone
Adjust the timezone to use when SquidAnalyzer reports a different time
than the graphs in your browser. The value must follow the format:
+/-HH. Default is to use local time. This must not be considered a
real timezone but the number of hours to add to or remove from log
timestamps to adjust graph times. For example:
TimeZone +01
will append one hour to all timestamps.
Additionally TimeZone can be set to auto:
TimeZone auto
to let SquidAnalyzer auto detect the timezone to use.
UseUrlPort
Enable this directive if you want to include the port number in URL
statistics. Default is to remove the port information from the URL.
UpdateAlias
Enable this directive if you want to immediately apply the changes
made in aliases files, to avoid duplicates. You still have to use
--rebuild to recreate previous reports with new aliases. Enabling
this implies a loss of performance with huge log files.
TimeStart and TimeStop
These two configuration directives allow you to specify a start and a
stop time. Log lines outside this time range will not be parsed. The
format of the value is HH:MM.
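For example, to only account traffic during office hours (illustrative
values):
TimeStart 08:00
TimeStop 18:00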
RefreshTime
Insert a html Refresh tag into all index.html files. The value is
the refresh interval in minutes. Default is 5 minutes. Can be
overridden at command line with option -R | --refresh.
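For example, to refresh report pages every 10 minutes (an illustrative
value):
RefreshTime 10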
SUPPORT
Release announcement
Please follow us on twitter to receive release announcements and the
latest news: https://twitter.com/SquidAnalyzer
Bugs and Feature requests
@ -632,7 +503,7 @@ SUPPORT
send me your ideas, feature requests or patches using the tools on the
git repository at https://github.com/darold/squidanalyzer
You can also support the developer by making a donation by
clicking on the "Donate" button on the SquidAnalyzer web site at
http://squidanalyzer.darold.net/
@ -640,7 +511,7 @@ AUTHOR
Gilles DAROLD <gilles@darold.net>
COPYRIGHT
Copyright (c) 2001-2019 Gilles DAROLD
This package is free software and published under the GPL v3 or above
license.

File diff suppressed because it is too large.

debian/.gitignore

@ -1,4 +0,0 @@
squidanalyzer
*.debhelper*
squidanalyzer.substvars
files

debian/changelog

@ -1,5 +0,0 @@
squidanalyzer (6.6-1) experimental; urgency=low
- Initial Debian package release
-- Benjamin Renard <brenard@easter-eggs.com> Thu, 11 Aug 2016 12:09:18 +0200

debian/compat

@ -1 +0,0 @@
7

debian/control

@ -1,19 +0,0 @@
Source: squidanalyzer
Section: admin
Priority: extra
Build-Depends: debhelper (>=7), apache2-dev
Maintainer: Gilles Darold <gilles@darold.net>
Package: squidanalyzer
Architecture: all
Description: Squid proxy log analyzer and report generator
Squid proxy native log analyzer and reports generator with full
statistics about times, hits, bytes, users, networks, top URLs and
top domains. Statistic reports are oriented toward user and
bandwidth control; this is not a pure cache statistics generator.
.
SquidAnalyzer uses flat files to store data and doesn't need any SQL,
SQL Lite or Berkeley databases.
.
This log analyzer is incremental and should be run in a daily cron,
or more often with heavy proxy usage.

debian/copyright

@ -1,4 +0,0 @@
Copyright (c) 2001-2019 Gilles DAROLD
This package is free software and published under the GPL v3 or above
license.

debian/rules

@ -1,16 +0,0 @@
#!/usr/bin/make -f
%:
dh $@ --with apache2
override_dh_auto_configure:
perl Makefile.PL \
INSTALLDIRS=vendor \
LOGFILE=/var/log/squid3/access.log \
BINDIR=/usr/bin \
CONFDIR=/etc/squidanalyzer \
HTMLDIR=/var/lib/squidanalyzer \
BASEURL=/squidreport \
MANDIR=/usr/share/man/man3 \
DOCDIR=/usr/share/doc/squidanalyzer \
DESTDIR=$(CURDIR)/debian/squidanalyzer


@ -1 +0,0 @@
conf debian/squidanalyzer.conf


@ -1,6 +0,0 @@
Alias /squidreport /var/lib/squidanalyzer
<Directory /var/lib/squidanalyzer>
Options -Indexes +FollowSymLinks +MultiViews
AllowOverride None
Require ip 127.0.0.1
</Directory>


@ -1,2 +0,0 @@
#!/bin/sh
/usr/bin/squid-analyzer

doc/SquidAnalyzer.pod

@ -30,7 +30,7 @@ browser side without extra installation required.
=head2 Generic install
If you want the package to be installed into the Perl distribution just
do the following:
perl Makefile.PL
@ -59,9 +59,6 @@ please proceed as follow:
as the issue is related to an install into the default Perl vendor installdirs
it will then use Perl site installdirs.
Note: you may no longer encounter this issue since v6.6, as SquidAnalyzer uses
site as the default installation directory.
=head2 Custom install
You can create your fully customized SquidAnalyzer installation by using the
Makefile.PL Perl script. Here is a sample:
@ -73,21 +70,9 @@ Makefile.PL Perl script. Here is a sample:
CONFDIR=/etc \
HTMLDIR=/var/www/squidreport \
BASEURL=/squidreport \
MANDIR=/usr/share/man/man3 \
DOCDIR=/usr/share/doc/squidanalyzer
Or you can install everything into one directory (e.g. /opt/squidanalyzer):
perl Makefile.PL \
LOGFILE=/var/log/squid/access.log \
BINDIR=/opt/squidanalyzer/bin \
CONFDIR=/opt/squidanalyzer/etc \
HTMLDIR=/var/www/squidreport \
BASEURL=/squidreport \
MANDIR=/opt/squidanalyzer/share/man/man3 \
DOCDIR=/opt/squidanalyzer/share/doc/squidanalyzer \
LIB=/opt/squidanalyzer/lib
If you want to build a distro package, there are two other options that you may use. The QUIET option is to tell Makefile.PL not to show the default post install README. The DESTDIR is to create and install all files in a package build base directory. For example, for a Fedora RPM, things may look like this:
# Make Perl and SquidAnalyzer distrib files
@ -142,32 +127,21 @@ or run it manually. For more information, see README file.
If your squid logfiles are rotated then cron isn't going to give the expected
result, as there exists a time between when the cron is run and when the
logfiles are rotated. It would be better to call squid-analyzer from
logrotate; create file /etc/logrotate.d/squid with the following content:
/var/log/squid/*.log {
daily
compress
delaycompress
rotate 5
missingok
nocreate
sharedscripts
postrotate
test ! -e /var/run/squid.pid || test ! -x /usr/sbin/squid || /usr/sbin/squid -k rotate
/usr/local/bin/squid-analyzer -d -l /var/log/squid/access.log.1
endscript
}
Be sure that the paths used in this script correspond to your system.
5. Adjust the configuration
Make sure that the LogFile path is correct in your squidanalyzer.conf file.
For instance:
LogFile /var/log/squid/access.log
You can also use network names instead of network IP addresses by using the
network-aliases file. Also, if you don't have authentication enabled and
want to replace client IP addresses by some known user or computer you
@ -176,27 +150,6 @@ can use the user-aliases file to do so.
See the file squidanalyzer.conf to customize your output statistics to
match your network and file system configuration.
=head2 Upgrade
Upgrading to a new release or to the latest development code is just like
installation. To install the latest development code and use the latest
enhancements, proceed as follows:
wget https://github.com/darold/squidanalyzer/archive/master.zip
unzip master.zip
cd squidanalyzer-master/
perl Makefile.PL
make
sudo make install
Then, to apply the changes to current reports, you have to rebuild them using:
squid-analyzer --rebuild
This command will rebuild all reports for which data files still exist,
i.e. those not removed by the retention limit. It can take a very long
time if you have a lot of history; in this case you may want to use the
-b or --build_date option to limit the rebuild period.
=head1 USAGE
SquidAnalyzer can be run manually or by cron job using the squid-analyzer Perl
@ -208,23 +161,17 @@ Usage: squid-analyzer [ -c squidanalyzer.conf ] [logfile(s)]
By default: /etc/squidanalyzer/squidanalyzer.conf
-b | --build_date date : set the date to be rebuilt, format: yyyy-mm-dd
or yyyy-mm or yyyy. Used with -r or --rebuild.
-d | --debug : show debug information.
-h | --help : show this message and exit.
-j | --jobs number : number of jobs to run at same time. Default
is 1, run as single process.
-o | --outputdir name : set output directory. If it does not start
with / then prefixes Output from configfile
-p | --preserve number : used to set the statistic obsolescence in
number of month. Older stats will be removed.
-P | --pid_dir directory : set directory where pid file will be stored.
Default /tmp/
-r | --rebuild : use this option to rebuild all html and graphs
output from all data files.
-R | --refresh minutes : add a html refresh tag into index.html file
with a refresh interval in minutes.
-s | --start HH:MM : log lines before this time will not be parsed.
-S | --stop HH:MM : log lines after this time will not be parsed.
-t | --timezone +/-HH : set number of hours from GMT of the timezone.
Use this to adjust date/time of SquidAnalyzer
output when it is run on a different timezone
than the squid server.
@ -232,16 +179,6 @@ Usage: squid-analyzer [ -c squidanalyzer.conf ] [logfile(s)]
--no-year-stat : disable years statistics, reports will start
from month level only.
--no-week-stat : disable weekly statistics.
--with-month-stat : enable month stats when --no-year-stat is used.
--startdate YYYYMMDDHHMMSS : lines before this datetime will not be parsed.
--stopdate YYYYMMDDHHMMSS : lines after this datetime will not be parsed.
--skip-history : used to not take care of the history file. Log
parsing offset will start at 0 but old history
file will be preserved at end. Useful if you
want to parse an old log file.
--override-history : when skip-history is used the current history
file will be overridden by the offset of the
last log file parsed.
Log files to parse can be given as command line arguments or as a comma separated
list of files for the LogFile configuration directive. By default SquidAnalyzer will
@ -340,9 +277,9 @@ For example:
=item LogFile squid_access_log_file
Set the path to the Squid log file. This can be a comma separated list of files
to process several files at the same time. If the files come from different
Squid servers, they will be merged in a single report. You can also add to the
list a SquidGuard log file; SquidAnalyzer will automatically detect the format.
=item UseClientDNSName 0
@ -352,17 +289,11 @@ the client ip address, this allow you to use the DNS name instead.
Note that you must have a working DNS resolution and that it can really slow
down the generation of reports.
=item DNSLookupTimeout 100
If you have enabled UseClientDNSName and have a lot of ip addresses that do not
resolve you may want to increase the DNS lookup timeout. By default
SquidAnalyzer will stop looking up a DNS name after 100 ms. The value must
be set in milliseconds.
=item StoreUserIp 0
Store and show the different ip addresses used by a user over time in user
statistics. Default: no extra storage.
=item NetworkAlias network-aliases_file
@ -372,8 +303,7 @@ create a file with this format:
LOCATION_NAME IP_NETWORK_ADDRESS
Separator must be a tabulation; this allows the use of the space character
in the network alias name.
You can use regex to match and group some network addresses. See
network-aliases file for examples.
@ -386,28 +316,18 @@ show username or computer name instead, create a file with this format:
	FULL_USERNAME	IP_ADDRESS

When 'UseClientDNSName' is enabled you can replace the ip address by a DNS name.

If you have auth_proxy enabled but want to replace the login name by a full
user name for example, create a file with this format:

	FULL_USERNAME	LOGIN_NAME

Separator for both must be a tabulation; this allows the use of space
characters in the user alias name.
You can use regex to match and group some user logins or ip addresses. See
the user-aliases file for examples.
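For example, following the sample user-aliases file shipped with the tool
(names and addresses are illustrative):

	MyFirstName	mylogin,192.168.1.12
	MyOtherNames	logon\d+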
=item UrlAlias url-aliases_file

Set the path to the file containing url alias names. You may want to group
URLs under a single alias to aggregate statistics, in this case create
a file with this format:

	URL_ALIAS	URL_REGEXP1,URL_REGEXP2,...

Separator must be a tabulation. See the network-aliases file for examples.
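For example, grouping video sites under a single alias, as in the sample
url-aliases file:

	VIDEO	.*\.googlevideo\.com.*,.*\.youtube\.com.*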
=item AnonymizeLogin 0
@ -435,7 +355,7 @@ Value can be: bytes or hits. Default is bytes.
=item UrlReport 0|1

Should SquidAnalyzer display user url details. This will show all
URLs read by a user. Take care to have enough disk space for large
users. Default is 0, no url detail report.
=item UserReport 0|1
@ -459,13 +379,13 @@ Default is 0, verbose mode.
=item CostPrice price/Mb

Used to set the cost of bandwidth per Mb. If you want to generate an
invoice per Mb of bandwidth traffic this can help you. A value of 0 means
no cost; this is the default value, and the "Cost" column is not displayed.

=item Currency currency_abbreviation

Used to set the currency of the bandwidth cost. Preferably use the html
special character. Default is &euro;
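For example, to bill one euro cent per megabyte (the rate is illustrative):

	CostPrice 0.01
	Currency &euro;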
=item TopNumber number
@ -473,18 +393,6 @@ special character. Default is &euro;
Used to set the number of top urls and second level domains to show.
Default is top 100.

=item TopDenied number

Used to set the number of top denied urls to show.
Default is top 100.

=item TopStorage number

Top number of urls to preserve in each data file, sorted by OrderUrl.
On huge access logs it will improve performance a lot but you will
have less precision in the top urls. Default is 0, all urls will
be stored.
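For example, to keep only the 100 most relevant urls per data file
(an illustrative trade-off between speed and precision):

	TopStorage 100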
=item TopUrlUser

Use this directive to show the top N users that look at a URL or a domain.
Set it to 0 to disable this feature. Default is top 10.
@ -500,7 +408,7 @@ exclusion (USER, CLIENT or URI) and a space separated list of valid regex.
You can also use the NETWORK type to define network addresses with a netmask
using the CIDR notation: xxx.xxx.xxx.xxx/n

See example below:

	NETWORK 192.168.1.0/24 10.10.0.0/16
	CLIENT 192\.168\.1\.2
@ -524,7 +432,7 @@ inclusion (USER or CLIENT) and a space separated list of valid regex.
You can also use the NETWORK type to define network addresses with a netmask
using the CIDR notation: xxx.xxx.xxx.xxx/n

See example below:

	NETWORK 192.168.1.0/24 10.10.0.0/16
	CLIENT 192\.168\.1\.2
@ -576,7 +484,7 @@ peer cache hit onto your stats.
=item TransfertUnit

Allows one to change the default unit used to display transfer sizes. Default
is BYTES; other possible values are KB, MB and GB.
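For example, to display transfer sizes in megabytes:

	TransfertUnit MB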
=item MinPie
@ -596,63 +504,30 @@ with a Locale set to fr_FR.
=item MaxFormatError

When SquidAnalyzer finds a corrupted line in its data file, it exits
immediately. You can force it to wait for a certain number of errors before
exiting. Of course you might want to remove the corrupted lines before the
next run. This can be useful if you have special characters in some fields
like mime type.
=item TimeZone

Adjust the timezone to use when SquidAnalyzer reports a different time than
the graphs in your browser. The value must follow the format: +/-HH. Default
is to use local time. This must not be considered a real timezone but the
number of hours to add to or remove from the log timestamps to have the
right hours reported in the graphs. The value can also be set to auto, in
which case SquidAnalyzer will autodetect the timezone and apply it.

For example:

	TimeZone +01

will append one hour to all timestamps.

Additionally TimeZone can be set to:

	TimeZone auto

to let SquidAnalyzer auto-detect the timezone to use.
=item UseUrlPort

Enable this directive if you want to include the port number in Url statistics.
Default is to remove the port information from the Url.

=item UpdateAlias

Enable this directive if you want the changes made in aliases files to be
applied immediately, to avoid duplicates. You still have to use --rebuild to
recreate previous reports with the new aliases. Enabling this will imply a
loss of performance with huge log files.

=item TimeStart and TimeStop

These two configuration directives allow you to specify a start and a stop
time. Log lines out of this time range will not be parsed. The format of
the value is HH:MM.
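For example, to restrict parsing to office hours (the range is illustrative):

	TimeStart 08:00
	TimeStop 18:00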
=item RefreshTime

Insert an html Refresh tag into all index.html files. The value is the
refresh interval in minutes. Default is 5 minutes. Can be overridden
at the command line with option -R | --refresh.
=back

=head1 SUPPORT

=head2 Release announcement

Please follow us on twitter to receive release announcements and the latest news: https://twitter.com/SquidAnalyzer

=head2 Bugs and Feature requests
@ -664,7 +539,7 @@ https://github.com/darold/squidanalyzer.
Any contribution to build a better tool is welcome, you just have to send me your ideas, feature requests or
patches using the tools on the git repository at https://github.com/darold/squidanalyzer

You can also support the developer by donating via the "Donate" button on the
SquidAnalyzer web site at http://squidanalyzer.darold.net/
=head1 AUTHOR
@ -673,7 +548,7 @@ Gilles DAROLD <gilles@darold.net>
=head1 COPYRIGHT

Copyright (c) 2001-2019 Gilles DAROLD

This package is free software and published under the GPL v3 or above
license.

View File

@ -7,7 +7,7 @@
# You can also use the NETWORK type to define network addresses with a netmask
# using the CIDR notation: xxx.xxx.xxx.xxx/n
#
# See example below:
#------------------------------------------------------------------------------
#NETWORK 192.168.1.0/24 10.10.0.0/16
#CLIENT 192\.168\.1\.2

View File

@ -8,7 +8,7 @@
# You can also use the NETWORK type to define network addresses with a netmask
# using the CIDR notation: xxx.xxx.xxx.xxx/n
#
# See example below:
#------------------------------------------------------------------------------
#NETWORK 192.168.1.0/24 10.10.0.0/16
#CLIENT 192\.168\.1\.2

View File

@ -22,9 +22,8 @@ UseClientDNSName 0
# If you have enabled UseClientDNSName and have a lot of ip addresses that do
# not resolve you may want to increase the DNS lookup timeout. By default
# SquidAnalyzer will stop looking up a DNS name after 100 ms. The value must
# be set in milliseconds.
DNSLookupTimeout 100
# Set the file containing network alias names. Networks are
# shown as Ip addresses so if you want to display names instead
@ -40,14 +39,7 @@ NetworkAlias /etc/squidanalyzer/network-aliases
# Separator must be a tabulation
UserAlias /etc/squidanalyzer/user-aliases
# Set the file containing url alias names. You may want to group Urls
# under a single alias to aggregate statistics, in this case create
# a file with this format :
# URL_ALIAS	URL_REGEXP
# Separator must be a tabulation
UrlAlias /etc/squidanalyzer/url-aliases

# How do we sort the Network, User and user's Url report screens
# Value can be: bytes, hits or duration. Default is bytes.
OrderNetwork bytes
OrderUser bytes
@ -72,25 +64,16 @@ UserReport 1
# Run in quiet mode or print debug information
QuietMode 1
# Cost of the bandwidth per Mb. If you want to generate an invoice per Mb
# of bandwidth traffic this can help you. A value of 0 means no cost.
CostPrice 0
# Currency of the bandwidth cost
Currency &euro;
# Top number of urls to show from all urls extracted from the log
TopNumber 100
# Top number of denied URLs to show
TopDenied 100

# Top number of urls to preserve in each data file, sorted by OrderUrl.
# On huge access logs it will improve performance a lot but you
# will have less precision in the top urls. Default is 0, all urls
# will be stored.
TopStorage 0
# Path to the file containing client ip addresses, network ip address,
# and/or auth login to exclude from report
Exclude /etc/squidanalyzer/excluded
@ -101,7 +84,7 @@ Exclude /etc/squidanalyzer/excluded
Include /etc/squidanalyzer/included
# Translation Lang /etc/squidanalyzer/lang/en_US.txt,
# en_US.txt, ru_RU.txt, uk_UA.txt, cs_CZ.txt, pl_PL.txt, hu_HU.txt and de_DE.txt).
# Default to:
#Lang /etc/squidanalyzer/lang/en_US.txt
@ -151,12 +134,6 @@ TopUrlUser 10
# Feel free to define your own header but take care to not break current design.
#CustomHeader <a href="http://my.isp.dom/"><img src="http://my.isp.dom/logo.png" title="My ISP link" border="0" width="100" height="110"></a> My ISP Company
# This directive allows you to replace the HTML page title by your custom title.
# The default value is defined as follows:
#	SquidAnalyzer $VERSION Report
# Feel free to define your own title but take care to not break current design.
#CustomTitle My ISP Company Report
# This directive allows exclusion of some unwanted methods in report statistics,
# like HEAD, POST, CONNECT, etc. Can be a comma separated list of methods.
#ExcludedMethods HEAD
@ -177,34 +154,8 @@ TopUrlUser 10
# can be useful if you have special characters in some fields like mime type.
#MaxFormatError 0
# Adjust the timezone to use when SquidAnalyzer reports a different time than
# the graphs in your browser. The value must follow the format: +/-HH. Default
# is to use local time. This must not be considered a real timezone but the
# number of hours to add to or remove from the log timestamps to obtain the
# right hours in the graphs. The value can also be set to auto, in which case
# SquidAnalyzer will autodetect the timezone and apply it.
#TimeZone +00
# Enable this directive if you want to include the port number in Url statistics.
# Default is to remove the port information from the Url.
#UseUrlPort 0
# Enable this directive if you want the changes made in aliases files to be
# applied immediately, to avoid duplicates. You still have to use --rebuild to
# recreate previous reports with the new aliases. Enabling this will imply a
# loss of performance with huge log files.
#UpdateAlias 0
# The two following configuration directives allow you to specify a start and
# stop time. Log lines out of this time range will not be parsed.
#TimeStart 00:00
#TimeStop 23:59
# Insert a html Refresh tag into all index.html files. The value is the
# refresh interval in minutes. Default is 5 minutes. Can be overridden
# at the command line with option -R | --refresh
RefreshTime 5
# Store the different ip addresses a user has used over time in the user
# statistics. Default: no extra storage
StoreUserIp 0

View File

@ -1,9 +0,0 @@
#-------------------------------------------------------------------------------
# Squid Analyzer Url Alias configuration file
# FORMAT : URL_ALIAS URL_REGEXP1,URL_REGEXP2,...
# Field separator must be one or more tabulations.
# See example below
#-------------------------------------------------------------------------------
#VIDEO .*\.googlevideo\.com.*,.*\.youtube\.com.*,.*\.viadeo.com.*
#FACEBOOK .*\.fbcdn\.net.*,.*\.facebook\.com.*
#GOOGLE .*\.google.*

View File

@ -1,8 +1,8 @@
#-------------------------------------------------------------------------------
# Squid Analyzer User Alias configuration file
# FORMAT : FULL_USER_NAME	IP_ADDRESS|LOGIN_NAME,LOGIN_REGEX
# Field separator must be one or more tabulations. Spaces in user names are
# allowed. See example below
#-------------------------------------------------------------------------------
#MyFirstName	mylogin,192.168.1.12
#MyOtherNames	logon\d+

View File

@ -85,7 +85,6 @@ Largest El més llarg
Url	Url
User_title	Estadístiques d'usuari en
User_number	Nombre d'usuaris
Url_title	Top %d site
Url_Hits_title	Top %d de clicks en
Url_Bytes_title	Top %d de bytes de Url en
Url_Duration_title	Top %d de duració de visita en
@ -107,13 +106,9 @@ Up_link A dalt
Click_year_stat	Feu clic a 'Estadístiques anuals' per a més detalls
Mime_graph_hits_title	Estadístiques de mime-types en
Mime_graph_bytes_title	Estadístiques de Mbytes en mime-types en
User	Usuari
Count	Contador
WeekDay	Dg Dl Dt Dc Dj Dv Ds
Week	Setmana
Top_denied_link	Top denegat
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Usuari Ip

View File

@ -85,7 +85,6 @@ Largest Největší
Url	Url
User_title	Uživatelská Statistika
User_number	Počet uživatelů
Url_title	Top %d webů
Url_Hits_title	Top %d Url hitů
Url_Bytes_title	Top %d Url Bytů
Url_Duration_title	Top %d Url doba přenosu
@ -113,7 +112,3 @@ WeekDay Ne Po Út St Čt Pá So
Week	Týden
Top_denied_link	Top Denied
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Uživatelská Ip

View File

@ -85,7 +85,6 @@ Largest Groesste
Url	Url
User_title	Benutzer Statistik
User_number	Anzahl der Benutzer
Url_title	Top %d Seiten
Url_Hits_title	Top %d Url Treffer
Url_Bytes_title	Top %d Url Bytes
Url_Duration_title	Top %d Url Dauer
@ -111,9 +110,5 @@ User Benutzer
Count	Anzahl
WeekDay	So Mo Di Mi Do Fr Sa
Week	Woche
Top_denied_link	Top Abgelehnt
Blocklist_acl_title	Blocklisten ACL Nutzung
Throughput Durchsatz
Graph_throughput_title %s Durchsatz bei
Throughput_graph Bytes/Sek.
User_Ip Benutzer Ip

View File

@ -85,7 +85,6 @@ Largest Largest
Url	Url
User_title	User Statistics on
User_number	Number of users
Url_title	Top %d site
Url_Hits_title	Top %d Url hits on
Url_Bytes_title	Top %d Url bytes on
Url_Duration_title	Top %d Url duration on
@ -113,7 +112,3 @@ WeekDay Su Mo Tu We Th Fr Sa
Week	Week
Top_denied_link	Top Denied
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip User Ip

View File

@ -85,7 +85,6 @@ Largest El más largo
Url	Url
User_title	Estadísticas de usuario en
User_number	Número de usuario
Url_title	Top %d Url
Url_Hits_title	Top %d de clicks en
Url_Bytes_title	Top %d de bytes de Url en
Url_Duration_title	Top %d de duración de visita en
@ -113,7 +112,3 @@ WeekDay Do Lu Ma Mi Ju Vi Sa
Week	Semana
Top_denied_link	Top Denied
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Usuario Ip

View File

@ -85,7 +85,6 @@ Largest Plus gros transfert
Url	Url
User_title	Statistiques utilisateurs pour
User_number	Nombre d'utilisateurs
Url_title	Top %d des sites
Url_Hits_title	Top %d des Urls par requêtes pour
Url_Bytes_title	Top %d des Urls par transferts pour
Url_Duration_title	Top %d des Urls par durée pour
@ -113,7 +112,3 @@ WeekDay Di Lu Ma Me Je Ve Sa
Week	Semaine
Top_denied_link	Top Rejets
Blocklist_acl_title	Utilisation des ACL de Blocklist
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Ip utilisateur

View File

@ -1,119 +0,0 @@
#------------------------------------------------------------------------------
# This is the translation file of the SquidAnalyzer program. The first column
# represents the program access key to the translated string and the second
# column is the translated string itself.
# Keys should not be modified and are case sensitive. The column separator
# is the tabulation character.
#
# Special tags %s and %d in the translated string are used by the program to
# replace dynamic values. Depending on the language, their place in the string
# may vary.
#
# Author: Gilles Darold
#------------------------------------------------------------------------------
CharSet utf-8
01 Jan
02 Febr
03 Márc
04 Ápr
05 Máj
06 Jún
07 Júl
08 Aug
09 Szept
10 Okt
11 Nov
12 Dec
KB Kilobytes
MB Megabytes
GB Gigabytes
Requests Kérés
Bytes Bytes
Megabytes Megabytes
Total Összesen
Years Évek
Months Hónapok
Days Napok
Hit Találat
Miss Letöltött
Denied Visszautasított
Cost Érték
Users Felhasználók
Sites Oldalak
Domains Domain nevek
Requests_graph Kérések
Megabytes_graph Megabyte-ok
Months_graph Hónapok
Days_graph Napok
Hit_graph Találat
Miss_graph Letöltött
Denied_graph Visszautasított
Total_graph Mindösszesen
Domains_graph Domain nevek
Users_help Felhasználók teljes száma erre az időszakra
Sites_help Meglátogatott oldalak teljes száma erre az időszakra
Domains_help Meglátogatott második szintű domain nevek teljes száma erre az időszakra
Hit_help Oldalak, képek, stb. amit megtalált a gyorsítótárban
Miss_help Oldalak, képek, stb. amit nem talált meg a gyorsítótárban
Denied_help Oldalak, képek, stb. amit visszautasított
Cost_help 1 Megabyte =
Generation A kimutatás -
Main_cache_title Cache statisztika
Cache_title Cache statisztika -
Stat_label Stat
Mime_link Mime típusok
Network_link Hálózatok
User_link Felhasználók
Top_url_link Leggyakoribb url-ek
Top_domain_link Leggyakoribb domain nevek
Back_link Vissza
Graph_cache_hit_title %s kérés statisztika -
Graph_cache_byte_title %s Megabyte-ok statisztika -
Hourly Óránként
Hours Órák
Daily Napi
Days Napok
Monthly Havi
Months Hónapok
Mime_title Mime típusok statisztika -
Mime_number Mime típusok száma
Network_title Hálózatok statisztikája -
Network_number Hálózatok száma
Duration Időtartam
Time Idő
Largest Legnagyobb
Url Url
User_title Felhasználói statisztika -
User_number Felhasználók száma
Url_title Első %d oldal
Url_Hits_title Első %d url találat -
Url_Bytes_title Első %d url byte-ok -
Url_Duration_title Első %d url időtartam -
Url_number Url-ek száma
Domain_Hits_title Első %d domain találat -
Domain_Bytes_title Első %d Domain byte-ok -
Domain_Duration_title	Első %d Domain időtartam -
Domain_number Domain-ek száma
Domain_graph_hits_title Domain találatok statisztika -
Domain_graph_bytes_title Domain byte-ok statisztika -
Second_domain_graph_hits_title Második szintű találatok statisztika -
Second_domain_graph_bytes_title Második szintű byte statisztika -
First_visit Első látogatás
Last_visit utolsó látogatás
Globals_Statistics Mindösszesen statisztikák
Legend Magyarázat
File_Generated File készült
Up_link Fel
Click_year_stat Kattintson az éves statisztikák link-re a részletekért
Mime_graph_hits_title Mime típusok találat statisztika -
Mime_graph_bytes_title Mime típusok MByte-ok statisztika -
User Felhasználó
Count Darabszám
WeekDay Va Hé Ke Sze Csü Pé Szo
Week Hét
Top_denied_link Legtöbbet visszautasított
Blocklist_acl_title Használt tiltólisták (ACL)
Throughput Átviteli sebesség
Graph_throughput_title %s átviteli sebesség -
Throughput_graph Bytes/sec
User_Ip Felhasználó IP címe

View File

@ -86,7 +86,6 @@ Largest Dim. Max DL
Url	Url
User_title	Statistiche utente per
User_number	Numero di utenti
Url_title	Principali %d siti
Url_Hits_title	Principali %d Url acceduti con successo
Url_Bytes_title	Principali %d Url per numero di byte per
Url_Duration_title	Principali %d Url per durata temporale
@ -114,7 +113,3 @@ Week Settimana
Weekly	Settimanali
Top_denied_link	Classifica Link Vietati
Blocklist_acl_title	Utilizzo ACL Blocklist
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Utente Ip

View File

@ -85,7 +85,6 @@ Largest Największy
Url	Url
User_title	Statystyki użytkowników z dnia
User_number	Liczba użytkowników
Url_title	Topowych %d witryn
Url_Hits_title	Topowych %d trafień dla Url w
Url_Bytes_title	Topowych %d bajtów dla Url w
Url_Duration_title	Topowych %d czasów trwania dla Url w
@ -113,7 +112,3 @@ WeekDay Nd Pn Wt Śr Cz Pt So
Week	Tydzień
Top_denied_link	Top Denied
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Użytkownik Ip

View File

@ -87,7 +87,6 @@ Largest Maior
Url	URL
User_title	Estat&iacute;sticas de usu&aacute;rios em
User_number	N&uacute;mero de usu&aacute;rios
Url_title	As Top %d sites
Url_Hits_title	As Top %d URLs por n&uacute;mero de acessos em
Url_Bytes_title	As Top %d URLs por bytes transferidos em
Url_Duration_title	As Top %d URLs por tempo de transfer&ecirc;ncia em
@ -115,7 +114,3 @@ WeekDay Su Mo Tu We Th Fr Sa
Week	Week
Top_denied_link	Top Denied
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Usu&aacute;rios Ip

View File

@ -8,7 +8,7 @@
# Special tags %s and %d in the translated string are used by the program to
# replace dynamic values. Depending on the language, their place in the string
# may vary.
# Version 1.2.2
# Author: Gilles Darold
# Russian translation by oxygen121.
# Additions and corrections by yvoinov.
@ -88,7 +88,6 @@ Largest Наибольший
Url	URL
User_title	Статистика по пользователям за
User_number	Количество пользователей
Url_title	Рейтинг %d URL
Url_Hits_title	%d самых посещаемых URL-запросов за
Url_Bytes_title	%d URL с наибольшим трафиком за
Url_Duration_title	%d URL с наибольшей продолжительностью за
@ -111,12 +110,8 @@ Click_year_stat Нажмите на готовую статистику для
Mime_graph_hits_title	Типы MIME (запросы) за
Mime_graph_bytes_title	Типы MIME (трафик) за
User	Пользователь
Count	Счетчик
WeekDay	Вс Пн Вт Ср Чт Пт Сб
Week	Неделя
Top_denied_link	Рейтинг отказов в доступе
Blocklist_acl_title	Использованные блок-листы (ACL)
Throughput Пропускная способность
Graph_throughput_title %s пропускная способность на
Throughput_graph Байт/сек
User_Ip Пользователь Ip

View File

@ -86,7 +86,6 @@ Largest Найбільший
Url	URL
User_title	Статистика по Користувачам за
User_number	Кількість Користувачів
Url_title	Toп %d URL
Url_Hits_title	Toп %d URL запитів за
Url_Bytes_title	Toп %d URL трафік за
Url_Duration_title	Toп %d URL тривалість за
@ -114,7 +113,3 @@ WeekDay Нд Пн Вт Ср Чт Пт Сб
Week	Тиждень
Top_denied_link	Top Denied
Blocklist_acl_title	Blocklist ACL use
Throughput Throughput
Graph_throughput_title %s throughput on
Throughput_graph Bytes/sec
User_Ip Користувач Ip

View File

@ -1,25 +0,0 @@
RPM/
Holds the squidanalyzer.spec file needed to build an RPM package for RH/CentOS/Fedora.
It may also be usable for other RPM based distributions.
Copy the squidanalyzer source tarball under:
~/rpmbuild/SOURCES/
Then create the RPM binary package as follows:
rpmbuild -bb squidanalyzer.spec
The binary package may be found here:
~/rpmbuild/RPMS/noarch/squidanalyzer-6.6-1.noarch.rpm
To check which files will be installed and where:
rpm -qlp ~/rpmbuild/RPMS/noarch/squidanalyzer-6.6-1.el7.noarch.rpm
To install run:
rpm -i ~/rpmbuild/RPMS/noarch/squidanalyzer-6.6-1.noarch.rpm

View File

@ -1,16 +1,17 @@
%define webdir /var/www

Summary: Squid proxy log analyzer and report generator
Name: squidanalyzer
Version: 6.6
Release: 1%{?dist}
License: GPLv3+
Group: Applications/Internet
URL: http://squidanalyzer.darold.net/
Source: http://prdownloads.sourceforge.net/squid-report/%{name}-%{version}.tar.gz
BuildRequires: perl
BuildArch: noarch
BuildRequires: perl-ExtUtils-MakeMaker, perl-ExtUtils-Install, perl-ExtUtils-Manifest, perl-ExtUtils-ParseXS, perl-Time-HiRes
BuildRequires: gdbm-devel, libdb-devel, perl-devel, systemtap-sdt-devel
@ -31,51 +32,43 @@ or more often with heavy proxy usage.
%setup -q

%build
# Build Makefile for SquidAnalyzer
%{__perl} Makefile.PL INSTALLDIRS=vendor DESTDIR=%{buildroot} LOGFILE=/var/log/squid/access.log BINDIR=%{_bindir} HTMLDIR=%{webdir}/%{name} BASEURL=/%{name} MANDIR=%{_mandir}/man3 QUIET=yes

# Compile
make
%install
# Clear buildroot from previous build
%{__rm} -rf %{buildroot}/

# Make install distrib files
%{__make} install

# Remove .packlist file (per rpmlint)
%{__rm} -f %{buildroot}/%perl_vendorarch/auto/SquidAnalyzer/.packlist
%{__rm} -f `find %{buildroot}/%{_libdir}/perl*/ -name .packlist -type f`
%{__rm} -f `find %{buildroot}/%{_libdir}/perl*/ -name perllocal.pod -type f`

# Install cron
%{__install} -d %{buildroot}/%{_sysconfdir}/cron.daily
echo -e "#!/bin/sh\n%{_bindir}/squid-analyzer" > %{buildroot}/%{_sysconfdir}/cron.daily/0%{name}
%files
%defattr(-, root, root, 0755)
%doc README ChangeLog
%{_mandir}/man3/squid-analyzer.3.gz
%{_mandir}/man3/SquidAnalyzer.3pm.gz
%{perl_vendorlib}/SquidAnalyzer.pm
%attr(0755,root,root) %{_bindir}/squid-analyzer
%attr(0755,root,root) %dir %{_sysconfdir}/%{name}
%attr(0664,root,root) %config(noreplace) %{_sysconfdir}/%{name}/%{name}.conf
%config(noreplace) %attr(0644,root,root) %{_sysconfdir}/%{name}/excluded
%config(noreplace) %attr(0644,root,root) %{_sysconfdir}/%{name}/included
%config(noreplace) %attr(0644,root,root) %{_sysconfdir}/%{name}/network-aliases
%config(noreplace) %attr(0644,root,root) %{_sysconfdir}/%{name}/user-aliases
%config(noreplace) %attr(0644,root,root) %{_sysconfdir}/%{name}/url-aliases
%config(noreplace) %attr(0754,root,root) %{_sysconfdir}/cron.daily/0%{name}
%attr(0755,root,root) %dir %{_sysconfdir}/%{name}/lang
%{_sysconfdir}/%{name}/lang/*
%attr(0755,root,root) %dir %{webdir}/%{name}
%{webdir}/%{name}/flotr2.js
%{webdir}/%{name}/sorttable.js
%{webdir}/%{name}/%{name}.css
%attr(0755,root,root) %dir %{webdir}/%{name}/images
%{webdir}/%{name}/images/*.png

%clean
%{__rm} -rf %{buildroot}

View File

@ -1,42 +0,0 @@
Licenses
--------
bean.js:
copyright (c) Jacob Thornton 2011-2012
* https://github.com/fat/bean
* MIT license
underscore.js:
copyright (c) 2009-2015 Jeremy Ashkenas, DocumentCloud and Investigative Reporters & Editors
* http://documentcloud.github.com/underscore
* MIT license.
flotr2.js:
copyright (c) 2012 Carl Sutherland
* https://github.com/HumbleSoftware/Flotr2/
* MIT License
The MIT License (MIT)
---------------------
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

View File

@ -1,34 +0,0 @@
Resource files are collected from their respective download places as follows:
sorttable:
----------
wget https://kryogenix.org/code/browser/sorttable/sorttable.js -O orig/sorttable.js
SquidAnalyzer uses a modified version of the library, obtained by applying the patch:

patch -p 1 orig/sorttable.js < sa-sorttable.diff

See the file sa-sorttable.diff for the changes.
bean:
-----
wget https://github.com/fat/bean/archive/v1.0.14.tar.gz
tar xzf v1.0.14.tar.gz bean-1.0.14/src/bean.js
cp bean-1.0.14/src/bean.js orig/bean.js
underscore.js:
--------------
wget http://underscorejs.org/underscore.js -O orig/underscore.js
flotr2:
-------
wget https://raw.githubusercontent.com/HumbleSoftware/Flotr2/master/flotr2.nolibs.js -O orig/flotr2.nolibs.js
Files are minified using yui-compressor and then appended, together with the
content of the squidanalyzer.js file, to the final flotr2.js file.

File diff suppressed because one or more lines are too long

View File

@ -1,32 +0,0 @@
--- sorttable.js 2012-10-15 21:11:14.000000000 +0200
+++ ../sorttable.js 2017-02-19 12:09:38.499610414 +0100
@@ -151,6 +151,8 @@
//sorttable.shaker_sort(row_array, this.sorttable_sortfunction);
/* and comment out this one */
row_array.sort(this.sorttable_sortfunction);
+ // SquidAnalyzer: Sort in descending order first
+ row_array.reverse();
tb = this.sorttable_tbody;
for (var j=0; j<row_array.length; j++) {
@@ -266,8 +268,18 @@
return aa-bb;
},
sort_alpha: function(a,b) {
- if (a[0]==b[0]) return 0;
- if (a[0]<b[0]) return -1;
+ // SquidAnalyzer: remove percentage for numeric sort
+ if (a[0].replace(/ <.*\(.*%\).*/, '')) {
+ b[0].replace(/ <.*\(.*%\).*/,'');
+ aa = parseFloat(a[0].replace(/[^0-9.-]/g,''));
+ if (isNaN(aa)) aa = 0;
+ bb = parseFloat(b[0].replace(/[^0-9.-]/g,''));
+ if (isNaN(bb)) bb = 0;
+ return aa-bb;
+ } else {
+ if (a[0]==b[0]) return 0;
+ if (a[0]<b[0]) return -1;
+ }
return 1;
},
sort_ddmm: function(a,b) {

File diff suppressed because one or more lines are too long

View File

@ -18,7 +18,7 @@ body { font-size: 10pt; background-color: #F1F1F1; min-width: 900px; }
#alignLeft { float: left; }

.italicPercent { font-style: italic; font-size: 0.813em; }

#contenu { padding-bottom: 50px; }
@ -34,25 +34,25 @@ div.uplink a { color: #222222; text-decoration: none; font-variant: small-caps;
div.uplink a:hover { color: #76add2; }

table.stata td a.domainLink { font-size: 1em; font-variant: normal; font-style: italic; }
table.graphs { margin-right: auto; margin-left: auto; }
table.stata th.headerBlack { color: #222222; font: bold 12px "Trebuchet MS", Verdana, Arial, Helvetica, sans-serif; font-variant: small-caps; }
table.stata { border-collapse: collapse; width: 90%; margin-left:auto; margin-right:auto; margin-bottom: 25px; border: 0px; white-space:nowrap; font-size: 0.9em; }
table.stata th { background: #76add2; font: 10px "Trebuchet MS", Verdana, Arial, Helvetica, sans-serif; font-variant: small-caps; font-weight: bold; letter-spacing: 2px; padding-left: 5px; padding-right: 5px; padding-top: 3px; padding-bottom: 3px; border: 2px solid silver; color: #F1F1F1; }
table.stata td { text-align: center; padding-left: 5px; padding-right: 5px; padding-top: 5px; padding-bottom: 5px; border: 2px solid silver; font-style: italic; }
table.stata th.nobg { background: none; border-top: 0px; border-left: 0px; padding-left: 5px; padding-right: 5px; padding-top: 3px; padding-bottom: 3px; }
table.stata td a { font-variant: small-caps; text-decoration: none; color: #222222; font-weight: bold; font-style: normal; font-size: 12px; }
table.stata td a:hover { color: #76add2; }

.displayLegend { margin-left: 50px; }

.iconUpArrow { background-image: url("./images/up-arrow.png"); background-position: left bottom; background-repeat: no-repeat; padding-left: 25px; margin-bottom: 20px; }
@ -100,7 +100,7 @@ table.stata td a:hover { color: #76add2; }
ul { padding:0; margin:0; list-style-type:none; }
li { float:right; }

ul li a { display:block; width:135px; text-decoration:none; text-align:center; padding:5px; color: white; font-variant: small-caps; letter-spacing: 2px; }
ul li a:hover { color: #76add2; }

#menu { margin: 0 auto; padding: 0 auto; height: 28px; clear: both; background-color: #222222; border-bottom: 3px solid #76add2; }
@ -138,10 +138,11 @@ div.information {
div.tooltipLink { position:relative; cursor:pointer; }
div.tooltipLink span.information { border-bottom:1px dotted gray; z-index:10; }
div.tooltipLink div.tooltip { display:none; background-color:#EBF0FC; border:1px solid #FFFFFF; -moz-border-radius:10px; padding:6px; width:200px; }
div.tooltipLink div.tooltip table { background-color:white; width:200px; }
div.tooltipLink div.tooltip table tr.row0 td { background-color: #FFFFFF; border: 1px solid #EEEEEE; }
div.tooltipLink div.tooltip table tr.row1 td { background-color: #EEEEEE; border: 1px solid #EEEEEE; }
div.tooltipLink div.tooltip th { font-size:11px; }
div.tooltipLink div.tooltip td { font-size:10px; font-weight:normal; padding:1px; }
div.tooltipLink:hover div.tooltip { display:block; z-index:20; position:absolute; top:1.5em; left:2em; }

View File

@ -1,107 +0,0 @@
var round = Math.round;
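// Toggle visibility of the given div and flip the button label between Show/Hide.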
function toggle(idButton, idDiv, label) {
if(document.getElementById(idDiv)) {
if(document.getElementById(idDiv).style.display == 'none') {
document.getElementById(idDiv).style.display = 'block';
document.getElementById(idButton).value = 'Hide '+label;
} else {
document.getElementById(idDiv).style.display = 'none';
document.getElementById(idButton).value = 'Show '+label;
}
}
}
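// Mouse tracker for time-series graphs: find the hovered timestamp in the
// datasets, then build an HTML snippet with one formatted value per series.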
function dateTracker(obj, gtype, labels, datasets)
{
var dateToDisplay = new Date(parseInt(obj.x));
var posValue = parseInt(obj.x);
// look for the position in data arrays
var pos = 0;
if (datasets != undefined) {
for (pos=0; pos < datasets[0].length; pos++) {
// If timestamp are the same we have found the position
if (datasets[0][pos][0] == posValue) {
// get out of here
break;
}
}
} else {
return '<span class="mfigure">NO DATASET</span>';
}
var textToShow = '<div class="mouse-figures">';
for (var i = 0; i < labels.length; i++) {
if (datasets[i] != undefined) {
textToShow += '<span class="mfigure">'+pretty_print_number(datasets[i][pos][1], gtype)+' <small>'+labels[i]+'</small></span><br>';
}
}
textToShow += '</div>';
return textToShow;
}
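// Simpler tracker for month/day bar graphs: resolve the month name from the
// global months array when needed, then show the series label with a rounded value.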
function dateTracker2(obj, dtype, gtype)
{
var dateToDisplay = obj.x;
if (dtype == 'month') {
var pos = parseInt(obj.x);
dateToDisplay = months[(pos-1)%12];
}
return dateToDisplay+', '+obj.series.label+': '+round(obj.y);
}
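// Pretty-print a numeric value: binary units (KiB..PiB) for sizes, ms/sec for
// durations, and SI suffixes (K..P) for plain counts.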
function pretty_print_number(val, type)
{
if (type == 'size') {
if (val >= 1125899906842624) {
val = (val / 1125899906842624);
val = val.toFixed(2) + " PiB";
} else if (val >= 1099511627776) {
val = (val / 1099511627776);
val = val.toFixed(2) + " TiB";
} else if (val >= 1073741824) {
val = (val / 1073741824);
val = val.toFixed(2) + " GiB";
} else if (val >= 1048576) {
val = (val / 1048576);
val = val.toFixed(2) + " MiB";
} else if (val >= 1024) {
val = (val / 1024);
val = val.toFixed(2) + " KiB";
} else {
val = val + " B";
}
} else if (type == 'duration') {
if (val >= 1000) {
val = (val / 1000);
val = val.toFixed(3) + " sec";
} else {
val = val + " ms";
}
} else {
if (val >= 1000000000000000) {
val = (val / 1000000000000000);
val = val.toFixed(2) + " P";
} else if (val >= 1000000000000) {
val = (val / 1000000000000);
val = val.toFixed(2) + " T";
} else if (val >= 1000000000) {
val = (val / 1000000000);
val = val.toFixed(2) + " G";
} else if (val >= 1000000) {
val = (val / 1000000);
val = val.toFixed(2) + " M";
} else if (val >= 1000) {
val = (val / 1000);
val = val.toFixed(2) + " K";
}
}
return val;
}
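// Tracker for pie charts: series label plus the rounded value.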
function pieTracker(obj)
{
return obj.series.label+': '+round(obj.y);
}

View File

@ -1,54 +0,0 @@
#!/bin/sh
#-----------------------------------------------------------------------------
#
# Script used to update squidanalyzer resources files.
# The script must be run in the resources directory.
#
# Files are minified using yui-compressor.
#-----------------------------------------------------------------------------
# Create the temporary directory
mkdir orig/ 2>/dev/null
rm flotr2.js
# Get sorttable.js file
wget https://kryogenix.org/code/browser/sorttable/sorttable.js -O orig/sorttable.js
# SquidAnalyzer uses a modified version of the library, apply patch
patch -p 1 orig/sorttable.js < sa-sorttable.diff
yui-compressor orig/sorttable.js -o orig/sorttable.min.js
# Update the flotr2.js script
wget https://raw.githubusercontent.com/HumbleSoftware/Flotr2/master/flotr2.nolibs.js -O orig/flotr2.nolibs.js
yui-compressor orig/flotr2.nolibs.js -o orig/flotr2.min.js
# Update the bean.js script
wget https://github.com/fat/bean/archive/v1.0.14.tar.gz
tar xzf v1.0.14.tar.gz bean-1.0.14/src/bean.js
cp bean-1.0.14/src/bean.js orig/
rm -rf bean-1.0.14/
rm v1.0.14.tar.gz
yui-compressor orig/bean.js -o orig/bean.min.js
# Update underscore.js
wget http://underscorejs.org/underscore.js -O orig/underscore.js
yui-compressor orig/underscore.js -o orig/underscore.min.js
cat squidanalyzer.js >> flotr2.js
echo "/* bean.min.js: see https://github.com/darold/squidanalyzer/tree/master/resources/LICENSE */" >> flotr2.js
cat orig/bean.min.js >> flotr2.js
echo "/* underscore.min.js: see https://github.com/darold/squidanalyzer/tree/master/resources/LICENSE */" >> flotr2.js
cat orig/underscore.min.js >> flotr2.js
echo "/* flotr2.min.js: see https://github.com/darold/squidanalyzer/tree/master/resources/LICENSE */" >> flotr2.js
cat orig/flotr2.min.js >> flotr2.js
cp orig/sorttable.min.js sorttable.js
# Remove temporary directory
rm -rf orig/

View File

@ -2,15 +2,11 @@
#
# Perl frontend to SquidAnalyzer.pm.
#
#use lib "PERL5LIB"
use strict;
use SquidAnalyzer;
use Getopt::Long qw(:config no_ignore_case bundling);
use Benchmark;
use POSIX ":sys_wait_h";
use Time::Local;
use File::Spec qw/ tmpdir /;
use File::Temp qw/ tempfile /;
$| = 1;
@ -25,22 +21,13 @@ my $preserve = '';
my $debug = 0;
my $version = 0;
my $build_date = '';
my $pid_dir = File::Spec->tmpdir() || '/tmp';
my $pidfile = 'squid-analyzer.pid';
my $queue_size = 0;
my $timezone = '';
my $no_year_stat = 0;
my $with_month_stat = 0;
my $no_week_stat = 0;
my $t0 = Benchmark->new;
my $start_time = '';
my $stop_time = '';
my $start_date = '';
my $stop_date = '';
my $outputdir = '';
my $skip_history = 0;
my $override_history = 0;
my $refresh_time = 0;
# get the command line parameters
my $result = GetOptions (
@ -50,22 +37,13 @@ my $result = GetOptions (
"h|help" => \$help, "h|help" => \$help,
"j|jobs=i" => \$queue_size, "j|jobs=i" => \$queue_size,
"l|logfile" => \$obsolete, "l|logfile" => \$obsolete,
"o|outputdir=s" => \$outputdir,
"p|preserve=i" => \$preserve, "p|preserve=i" => \$preserve,
"P|pid_dir=s" => \$pid_dir, "P|pid_dir=s" => \$pid_dir,
"r|rebuild!" => \$rebuild, "r|rebuild!" => \$rebuild,
"R|refresh=i" => \$refresh_time,
"s|start=s" => \$start_time,
"S|stop=s" => \$stop_time,
"t|timezone=s" => \$timezone, "t|timezone=s" => \$timezone,
"v|version!" => \$version, "v|version!" => \$version,
"no-year-stat!" => \$no_year_stat, "no-year-stat!" => \$no_year_stat,
"no-week-stat!" => \$no_week_stat, "no-week-stat!" => \$no_week_stat,
"with-month-stat!" => \$with_month_stat,
"startdate=s" => \$start_date,
"stopdate=s" => \$stop_date,
"skip-history!" => \$skip_history,
"override-history!" => \$override_history,
);

# Show warning for obsolete options
@ -87,25 +65,6 @@ if ($build_date) {
}
}
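# Validate the --start/--stop times, the --startdate/--stopdate values and
# the --override-history dependency before going any further.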
if ($start_time && $start_time !~ /^[0-2]\d:[0-5]\d$/) {
die("FATAL: bad format on start time, must be HH:MM.\n");
}
if ($stop_time && $stop_time !~ /^[0-2]\d:[0-5]\d$/) {
die("FATAL: bad format on stop time, must be HH:MM.\n");
}
if ($start_date && $start_date !~ /^\d{4}[-\\\/]?[0-1]\d[-\\\/]?[0-3]\d\s*[0-2]\d[-:]?[0-5]\d[-:]?[0-5]\d$/) {
die("FATAL: bad format on start date, must be YYYYMMDDHHMMSS.\n");
}
$start_date =~ s/[-\\\/:\s]//g if ($start_date);
if ($stop_date && $stop_date !~ /^\d{4}[-\\\/]?[0-1]\d[-\\\/]?[0-3]\d\s*[0-2]\d[-:]?[0-5]\d[-:]?[0-5]\d$/) {
die("FATAL: bad format on stop date, must be YYYYMMDDHHMMSS.\n");
}
$stop_date =~ s/[-\\\/:\s]//g if ($stop_date);
if ($override_history and ! $skip_history) {
die("FATAL: --override-history option can be used only with --skip-history.\nSee usage (--help) for more information.\n");
}
# Add multiple log files given from command line
foreach my $f (@ARGV) {
push(@logfile, $f) if (-f $f && !-z $f);
@ -133,41 +92,10 @@ close(OUT);
unlink("$pid_dir/last_parsed.tmp");

# Instantiate the SquidAnalyzer.pm perl module
my $sa = new SquidAnalyzer($configfile, join(',', @logfile), $debug, $rebuild, $pid_dir, $pidfile, $timezone, $skip_history, $refresh_time);
$sa->{no_year_stat} = $no_year_stat;
$sa->{with_month_stat} = $with_month_stat;
$sa->{no_week_stat} = $no_week_stat;
$sa->{queue_size} = $queue_size;
$sa->{TimeStart} = $start_time;
$sa->{TimeStop} = $stop_time;
$sa->{OverrideHistory} = $override_history;
# Set start and end time (for custom date range reports)
if ($start_date && $start_date =~ /^(\d{4})(\d\d)(\d\d)(\d\d)(\d\d)(\d\d)/) {
my $t = timelocal($6, $5, $4, $3, $2-1, $1);
$sa->{report_starttime} = POSIX::strftime("%a %b %e %H:%M:%S %Y", localtime($t));
--$t; # 1 second less
$sa->{history_time} = $sa->{sg_history_time} = $sa->{ug_history_time} = "$t.999";
print STDERR "DEBUG: report start time set to $sa->{report_starttime}\n" if ($debug);
}
if ($stop_date && $stop_date =~ /^(\d{4})(\d\d)(\d\d)(\d\d)(\d\d)(\d\d)/) {
my $t = timelocal($6, $5, $4, $3, $2-1, $1);
$sa->{report_endtime} = POSIX::strftime("%a %b %e %H:%M:%S %Y", localtime($t));
$sa->{history_endtime} = "$t.999";
print STDERR "DEBUG: report end time set to $sa->{report_endtime}\n" if ($debug);
}
# Set output directory
if ($outputdir) {
die "ERROR: Invalid output directory name specified\n" if ($outputdir !~ /^[-\w\/]+$/);
$outputdir = "$sa->{Output}/$outputdir" if ($outputdir !~ /^\//);
if (! -e $outputdir) {
mkdir ($outputdir) || die "ERROR: can't create directory $outputdir, $!\n";
}
$sa->{Output} = $outputdir;
print STDERR "DEBUG: Output directory set to $outputdir\n" if ($debug);
}
# Die cleanly on signal
sub terminate
@ -194,13 +122,9 @@ sub terminate
if (-e "$pid_dir/$pidfile") {
unlink("$pid_dir/$pidfile") or print("ERROR: Unable to remove pid file $pid_dir/$pidfile, $!\n");
}
foreach my $tmp_file ('last_parsed.tmp', 'sg_last_parsed.tmp', 'ug_last_parsed.tmp')
{
if (-e "$pid_dir/$tmp_file")
{
unlink("$pid_dir/$tmp_file") or print("ERROR: Unable to remove temp file $pid_dir/$tmp_file, $!\n");
}
}
exit 0;
}
@ -209,11 +133,10 @@ sub terminate
$SIG{'INT'} = \&terminate;
$SIG{'TERM'} = \&terminate;
$SIG{'CHLD'} = 'DEFAULT';
$SIG{'HUP'} = 'IGNORE'; # don't die on HUP
my $t1; my $t1;
# Run parsing only if we have a log file or that we are not in rebuild mode # Run parsing only if we have a log file or that we are not in rebuild mode
if (!$rebuild || ($#{$sa->{LogFile}} >= 0)) { if (!$rebuild || ($#logfile >= 0)) {
$sa->parseFile(); $sa->parseFile();
if ($debug) { if ($debug) {
$t1 = Benchmark->new; $t1 = Benchmark->new;
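# Editor's note: the hunk is truncated here; $t1 is presumably used with
# Benchmark's timediff()/timestr() to report elapsed parse time under
# --debug. A minimal standalone sketch of that standard pattern:
use Benchmark qw(timediff timestr);
my $t0 = Benchmark->new;
sleep 2;    # stand-in for the log parsing work
my $t_end = Benchmark->new;
print STDERR "DEBUG: parsing took ", timestr(timediff($t_end, $t0)), "\n";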
@@ -230,7 +153,7 @@ if ($preserve) {
# In rebuild mode history time is not used and we must store the
# specific rebuild date if any is provided at command line.
if ($rebuild) {
	$sa->{history_time} = $sa->{sg_history_time} = $sa->{ug_history_time} = '';
	$sa->{build_date} = $build_date;
}
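# Editor's note: this pairs with the -b option; a hypothetical
# "squid-analyzer -r -b 2017-05" clears the history offsets above and
# rebuilds only May 2017 from the stored data files.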
@@ -262,23 +185,17 @@ Usage: squid-analyzer [ -c squidanalyzer.conf ] [logfile(s)]
                                 By default: $DEFAULT_CONFFILE
    -b | --build_date date     : set the date to be rebuilt, format: yyyy-mm-dd
                                 or yyyy-mm or yyyy. Used with -r or --rebuild.
    -d | --debug               : show debug information.
    -h | --help                : show this message and exit.
    -j | --jobs number         : number of jobs to run at the same time.
                                 Default is 1, run as a single process.
    -o | --outputdir name      : set the output directory. If it does not
                                 start with /, it is prefixed with the Output
                                 directive from the configuration file.
    -p | --preserve number     : used to set the statistic obsolescence in
                                 number of months. Older stats will be removed.
    -P | --pid_dir directory   : set the directory where the pid file will be
                                 stored. Default: /tmp/
    -r | --rebuild             : use this option to rebuild all HTML and graph
                                 output from all data files.
    -R | --refresh minutes     : add an HTML refresh tag into the index.html
                                 file with a refresh interval in minutes.
    -s | --start HH:MM         : log lines before this time will not be parsed.
    -S | --stop HH:MM          : log lines after this time will not be parsed.
    -t | --timezone +/-HH      : set the number of hours from GMT of the
                                 timezone. Use this to adjust the date/time of
                                 SquidAnalyzer output when it is run in a
                                 different timezone than the squid server.
@@ -286,16 +203,6 @@ Usage: squid-analyzer [ -c squidanalyzer.conf ] [logfile(s)]
    --no-year-stat             : disable year statistics; reports will start
                                 from the month level only.
    --no-week-stat             : disable weekly statistics.
    --with-month-stat          : enable month stats when --no-year-stat is used.
    --startdate YYYYMMDDHHMMSS : lines before this datetime will not be parsed.
    --stopdate YYYYMMDDHHMMSS  : lines after this datetime will not be parsed.
    --skip-history             : ignore the history file. The log parsing offset
                                 will start at 0, but the old history file will
                                 be preserved at the end. Useful if you want to
                                 parse an old log file (see the example below).
    --override-history         : when --skip-history is used, the current
                                 history file will be overridden by the offset
                                 of the last log file parsed.
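    For example (hypothetical paths and dates):
        squid-analyzer -s 08:00 -S 18:00 /var/log/squid/access.log
    parses only office-hours traffic from the current log, while
        squid-analyzer --skip-history --startdate 20170101000000 /var/log/squid/access.log.1
    replays an old log from the given datetime onward without disturbing the
    existing history file.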
Log files to parse can be given as command-line arguments or as a comma-separated
list of files for the LogFile configuration directive. By default SquidAnalyzer will