Tuesday, 23 December 2014

$5 EBOOK BONANZA – EVERY TITLE, EVERY TOPIC

With all $5 products available in a range of formats and DRM-free, customers will find great-value content delivered exactly how they want it across Packt’s website this Christmas and New Year. From Thursday 18th December, every eBook and video will be available on the publisher’s website for just $5 until 6th January. More info: http://bit.ly/1uW4pQG

Thursday, 7 August 2014

Apache stats for bots

Recently my server got overloaded by heavy queries from bots. I needed to know which bot was the culprit, so I wrote a simple script that parses the Apache logs and picks out the bots.
#!/usr/bin/perl
use File::Basename;
use Time::Piece;
use Term::ANSIColor qw(:constants);

# path to the access log, e.g. /var/log/apache/access.log
my $plik = shift @ARGV or die "Usage: $0 access.log\n";

if (-T $plik) {
    open(PLIK, "$plik") || die "cannot open file: $plik!!!\n";
} elsif (-B $plik) {
    # compressed log: read through zcat
    open(PLIK, "zcat $plik |") || die "cannot open file: $plik!!!\n";
} else {
    print "File $plik cannot be opened\n";
    exit;
}

while (defined($log = <PLIK>)) {
    my ($host,$date,$reqtype,$url,$proto,$status,$size,$referrer,$agent) =
        $log =~ m/^(\S+) - - \[(\S+ [\-|\+]\d{4})\] "(GET|POST)\s(.+)\sHTTP\/(\d.\d)" (\d{3}) (\d+|-) "(.*?)" "([^"]+)"$/;
    if ($status eq "200" && $reqtype eq "GET" && $agent =~ m/bot/i) {
        my $dt = Time::Piece->strptime($date, '%d/%b/%Y:%H:%M:%S %z');
        $date = $dt->strftime('%Y-%m-%d');
        $slugnumber{$agent}{$date}{$host}++;
        $bot{$agent}++;
    }
}
close(PLIK);

foreach $klucz (sort keys %slugnumber) {
    print "\n================================================\n";
    print BOLD, BLUE, "\n $klucz \n", RESET;
    foreach $data (keys %{ $slugnumber{$klucz} }) {
        print BOLD, BLUE, "\n $data \n", RESET;
        foreach $ipek (keys %{ $slugnumber{$klucz}{$data} }) {
            print "$klucz $data [$ipek] : $slugnumber{$klucz}{$data}{$ipek}\n";
        }
    }
}
Below is the output:
testing> perl ipstats.pl /var/log/apache/access.log
================================================

 Yeti/1.1 (Naver Corp.; http://help.naver.com/robots/)

 2014-08-05
Yeti/1.1 (Naver Corp.; http://help.naver.com/robots/) 2014-08-05 [125.209.211.199] : 1

 2014-08-04
Yeti/1.1 (Naver Corp.; http://help.naver.com/robots/) 2014-08-04 [125.209.211.199] : 1
================================================

 msnbot/2.0b (+http://search.msn.com/msnbot.htm)

 2014-08-05
msnbot/2.0b (+http://search.msn.com/msnbot.htm) 2014-08-05 [65.55.213.247] : 10
msnbot/2.0b (+http://search.msn.com/msnbot.htm) 2014-08-05 [65.55.213.243] : 4
msnbot/2.0b (+http://search.msn.com/msnbot.htm) 2014-08-05 [65.55.213.242] : 2
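For a quick look without Perl, a similar per-agent count can be sketched with awk. This is a minimal sketch on a made-up sample log in combined format; splitting each line on double quotes leaves the user agent in field 6:

```shell
# build a tiny sample access log in combined format (hypothetical data)
cat > /tmp/access_sample.log <<'EOF'
1.2.3.4 - - [05/Aug/2014:10:00:00 +0200] "GET / HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"
1.2.3.4 - - [05/Aug/2014:10:00:01 +0200] "GET /a HTTP/1.1" 200 123 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"
5.6.7.8 - - [05/Aug/2014:10:00:02 +0200] "GET /b HTTP/1.1" 200 123 "-" "Mozilla/5.0"
EOF
# split each line on double quotes: field 6 is the user agent;
# count requests whose agent mentions "bot", case-insensitively
awk -F'"' 'tolower($6) ~ /bot/ {c[$6]++} END {for (a in c) print c[a], a}' /tmp/access_sample.log | sort -rn
```

This only gives totals per agent; for the per-day, per-IP breakdown the Perl script above is still the way to go.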

Wednesday, 25 June 2014

Grep - get first and last line

Recently I had to search my logs for a certain message and correlate it with user login/logout times.

I needed the approximate times of the first and last occurrences of the message in each log (the logs contain a huge number of entries with different timestamps but the same message).

I used sed and grep for this:

root@testing:~# for i in `ls /var/log/syslog/syslog*`; do zgrep 'port 1099' $i | sed -n '1p;$p'; done
Jun 25 08:18:01 testing sshd[33286]: error: connect_to x.y.z.c port 1099: failed.
Jun 25 11:30:52 testing sshd[45831]: error: connect_to x.y.z.d port 1099: failed.
Jun 24 07:55:04 testing sshd[64527]: error: connect_to x.y.z.d port 1099: failed.
Jun 24 11:53:13 testing sshd[64527]: error: connect_to x.y.z.c port 1099: failed.
Jun 23 08:59:52 testing sshd[34130]: error: connect_to x.y.z.c port 1099: failed.
Jun 23 15:28:38 testing sshd[34130]: error: connect_to x.y.z.d port 1099: failed.
Jun 20 08:24:51 testing sshd[64526]: error: connect_to x.y.z.c port 1099: failed.
Jun 20 10:55:46 testing sshd[7805]: error: connect_to x.y.z.c port 1099: failed.
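The sed expression does all the work here: -n suppresses default output, 1p prints the first line of its input and $p the last, so the piped grep result collapses to its first and last match. A self-contained demo with made-up lines:

```shell
# four lines in, only the first and last come out
printf 'start 08:18\nnoise\nnoise\nend 11:30\n' | sed -n '1p;$p'
# prints:
# start 08:18
# end 11:30
```

One caveat: if the input has exactly one line, both 1p and $p fire on it and the line is printed twice.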

Monday, 16 June 2014

Get all files from remote directory using wget

wget -A pdf,jpg -m -p -E -k -K -np http://site/path/

The flags: -A accepts only the listed extensions, -m mirrors recursively, -p fetches page requisites, -E adds .html extensions where needed, -k converts links for local viewing, -K keeps the original files as backups, and -np never ascends to the parent directory.

Thursday, 12 June 2014

Check SSL expiration date

Recently I needed to check my domains' SSL certificate expiration dates. I used the script below; it saves a lot of time.

How does it work ?

I need a file containing the domains in the form address:port. In a loop I fetch each domain's certificate expiration date with openssl's "-enddate" option, convert the dates to timestamps and subtract the current date from the expiration date. If the result is negative, the certificate has already expired; if it is lower than daystimestamp (the warning period in seconds), the script warns that the certificate will expire soon.

shamrock@alucard:~$ cat check_ssl_expiration.sh
days=30
daystimestamp=`expr $days \* 86400`
now=`date "+%s"`
for domain in `cat domeny.txt`
do
  expire=`echo | openssl s_client -connect $domain 2>/dev/null | openssl x509 -noout -enddate | awk -F \= '{print $2}'`
  out=`date -d "$expire" "+%s"`
  res=`expr $out - $now`
  if [ $res -lt 0 ]; then
    echo "ALARM !!! DOMAIN $domain EXPIRED ON $expire"
  elif [ $res -lt $daystimestamp ]; then
    echo "ALARM !!! DOMAIN $domain WILL EXPIRE ON $expire"
  else
    echo "$domain WILL EXPIRE ON $expire"
  fi
done
Usage:
shamrock@alucard:~$ bash check_ssl_expiration.sh
poczta.onet.pl:443 WILL EXPIRE ON Oct 14 06:10:33 2014 GMT
ALARM !!! DOMAIN some.kind.of.mons.tr:443 EXPIRED ON Jul 3 13:19:18 2013 GMT
The file contains the domains:
shamrock@alucard:~$ cat domeny.txt
poczta.onet.pl:443
some.kind.of.mons.tr:443
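The date arithmetic can be exercised in isolation. This sketch assumes GNU date and pins "now" to a fixed, hypothetical date so the result is reproducible:

```shell
# compare a certificate expiry date against a fixed, hypothetical "now"
expire="Oct 14 06:10:33 2014 GMT"
out=`date -d "$expire" "+%s"`                    # expiry as a Unix timestamp
now=`date -d "Jun 12 00:00:00 2014 GMT" "+%s"`   # pretend "now" for the demo
days=30
res=`expr $out - $now`
if [ $res -lt 0 ]; then
  echo "EXPIRED"
elif [ $res -lt `expr $days \* 86400` ]; then
  echo "EXPIRES WITHIN $days DAYS"
else
  echo "OK"
fi
```

With these dates the expiry is about four months away, so the demo takes the "OK" branch.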

Tuesday, 22 April 2014

SSL certificate: DER, PEM and error 20

I had an SSL certificate on an HTTPS server, but whenever I connected to my site I saw a certificate error. First I found that my certificate was in DER format, which my server does not accept, so I had to convert it to PEM:
root@prod:~/cert# openssl x509 -in my_cert.cer -out my_cert.pem -inform DER -outform PEM
Then I saw that I still had problems with the certificate:
root@prod:~/cert# openssl verify my_cert.pem
my_cert.pem: [details removed]
error 20 at 0 depth lookup:unable to get local issuer certificate
I googled and found that I should get the issuer certificate and combine it with my certificate. I downloaded the issuer certificate in DER format and converted it to PEM:
root@prod:~/cert# openssl x509 -in l4.cer -out l4.pem -inform DER -outform PEM
Then I checked that everything was OK:
root@prod:~/cert# openssl verify -CAfile l4.pem my_cert.pem
my_cert.pem: OK
The final step was to combine my_cert.pem with l4.pem:
root@prod:~/cert# cat my_cert.pem l4.pem >> my_cert.crt
I copied my_cert.crt and my_cert.key to /etc/ssl/ and configured nginx to use them. No more certificate errors.
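The whole error-20 situation can be reproduced with a throwaway CA. All file and subject names below are hypothetical, generated on the spot:

```shell
# work in a scratch directory
cd "$(mktemp -d)"
# a self-signed "issuer" certificate standing in for the real CA
openssl req -x509 -newkey rsa:2048 -nodes -keyout ca.key -out ca.pem -subj "/CN=demo-ca" -days 1
# a leaf certificate signed by it
openssl req -newkey rsa:2048 -nodes -keyout leaf.key -out leaf.csr -subj "/CN=demo-leaf"
openssl x509 -req -in leaf.csr -CA ca.pem -CAkey ca.key -CAcreateserial -out leaf.pem -days 1
# without the issuer: error 20, unable to get local issuer certificate
openssl verify leaf.pem || true
# with the issuer supplied via -CAfile: OK
openssl verify -CAfile ca.pem leaf.pem
```

Note that when concatenating for nginx, the order matters: the server certificate comes first, then the issuer chain, which is exactly what the `cat my_cert.pem l4.pem` above produces.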

Friday, 11 April 2014

Solr configure logging to file

First download Solr and commons-logging:
root@testing:# wget http://ftp.ps.pl/pub/apache/lucene/solr/4.7.1/solr-4.7.1.tgz
root@testing:# wget http://ftp.piotrkosoft.net/pub/mirrors/ftp.apache.org//commons/logging/binaries/commons-logging-1.1.3-bin.tar.gz
Unpack both:
root@testing:# tar -zxf solr-4.7.1.tgz
root@testing:# tar -zxf commons-logging-1.1.3-bin.tar.gz
Copy solr-4.7.1.war to a temporary directory, e.g.:
root@testing:# mkdir tmp
root@testing:# cd tmp
root@testing:tmp# cp ../solr-4.7.1/dist/solr-4.7.1.war .
Unpack solr*.war
root@testing:tmp# jar -xf solr-4.7.1.war
Copy log4j*.jar and slf4j*.jar from solr-4.7.1 to WEB-INF/lib/:
root@testing:tmp# cp ../solr-4.7.1/example/lib/ext/{log4j-1.2.16.jar,slf4j*} WEB-INF/lib/
Copy commons-logging*.jar from commons-logging to WEB-INF/lib/:
root@testing:tmp# cp ../commons-logging-1.1.3/commons-logging-1.1.3.jar WEB-INF/lib/
Create a classes directory in WEB-INF/:
root@testing:tmp# mkdir WEB-INF/classes
Copy log4j.properties from solr*/example/resources/ to WEB-INF/classes:
root@testing:tmp# cp ../solr-4.7.1/example/resources/log4j.properties WEB-INF/classes/
root@testing:tmp# cat WEB-INF/classes/log4j.properties
#  Logging level
solr.log=/var/log/solr
log4j.rootLogger=INFO, file, CONSOLE

log4j.appender.CONSOLE=org.apache.log4j.ConsoleAppender
log4j.appender.CONSOLE.layout=org.apache.log4j.PatternLayout
log4j.appender.CONSOLE.layout.ConversionPattern=%-4r [%t] %-5p %c %x \u2013 %m%n

#- size rotation with log cleanup.
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.MaxFileSize=4MB
log4j.appender.file.MaxBackupIndex=9

#- File to log to and log format
log4j.appender.file.File=${solr.log}/solr.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%-5p - %d{yyyy-MM-dd HH:mm:ss.SSS}; %C; %m\n

log4j.logger.org.apache.zookeeper=WARN
log4j.logger.org.apache.hadoop=WARN

# set to INFO to enable infostream log messages
log4j.logger.org.apache.solr.update.LoggingInfoStream=OFF
Create the log directory configured in log4j.properties and change its owner to the user running Tomcat:
root@testing:tmp# mkdir /var/log/solr
root@testing:tmp# chown tomcat7:tomcat7 /var/log/solr
Pack everything back into solr*.war:
root@testing:tmp# jar -cf solr-4.7.1.war META-INF WEB-INF admin.html css favicon.ico img js tpl
If everything is OK you should see logs in the log directory:
root@testing:tmp# cat /var/log/solr/solr.log
INFO  - 2014-04-11 12:23:58.890; org.apache.solr.servlet.SolrDispatchFilter; SolrDispatchFilter.init()
INFO  - 2014-04-11 12:23:58.916; org.apache.solr.core.SolrResourceLoader; Using JNDI solr.home: /srv/solr
INFO  - 2014-04-11 12:23:58.919; org.apache.solr.core.SolrResourceLoader; new SolrResourceLoader for directory: '/srv/solr/'
INFO  - 2014-04-11 12:23:59.145; org.apache.solr.core.ConfigSolr; Loading container configuration from /srv/solr/solr.xml
INFO  - 2014-04-11 12:23:59.358; org.apache.solr.core.CorePropertiesLocator; Config-defined core root directory: /srv/solr
INFO  - 2014-04-11 12:23:59.372; org.apache.solr.core.CoreContainer; New CoreContainer 1409622974
INFO  - 2014-04-11 12:23:59.372; org.apache.solr.core.CoreContainer; Loading cores into CoreContainer [instanceDir=/srv/solr/]
INFO  - 2014-04-11 12:23:59.398; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting socketTimeout to: 0
INFO  - 2014-04-11 12:23:59.398; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting urlScheme to: null
INFO  - 2014-04-11 12:23:59.406; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting connTimeout to: 0
INFO  - 2014-04-11 12:23:59.406; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting maxConnectionsPerHost to: 20
INFO  - 2014-04-11 12:23:59.406; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting corePoolSize to: 0
INFO  - 2014-04-11 12:23:59.407; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting maximumPoolSize to: 2147483647
INFO  - 2014-04-11 12:23:59.407; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting maxThreadIdleTime to: 5
INFO  - 2014-04-11 12:23:59.407; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting sizeOfQueue to: -1
INFO  - 2014-04-11 12:23:59.407; org.apache.solr.handler.component.HttpShardHandlerFactory; Setting fairnessPolicy to: false
INFO  - 2014-04-11 12:23:59.588; org.apache.solr.logging.LogWatcher; SLF4J impl is org.slf4j.impl.Log4jLoggerFactory
INFO  - 2014-04-11 12:23:59.589; org.apache.solr.logging.LogWatcher; Registering Log Listener [Log4j (org.slf4j.impl.Log4jLoggerFactory)]
INFO  - 2014-04-11 12:23:59.591; org.apache.solr.core.CoreContainer; Host Name:
INFO  - 2014-04-11 12:23:59.677; org.apache.solr.core.CorePropertiesLocator; Looking for core definitions underneath /srv/solr
INFO  - 2014-04-11 12:23:59.678; org.apache.solr.core.CorePropertiesLocator; Found 0 core definitions
INFO  - 2014-04-11 12:23:59.680; org.apache.solr.servlet.SolrDispatchFilter; user.dir=/var/lib/tomcat7
INFO  - 2014-04-11 12:23:59.680; org.apache.solr.servlet.SolrDispatchFilter; SolrDispatchFilter.init() done
INFO  - 2014-04-11 12:24:15.866; org.apache.solr.servlet.SolrDispatchFilter; [admin] webapp=null path=/admin/cores params={indexInfo=false&_=1397211686406&wt=json} status=0 QTime=69
INFO  - 2014-04-11 12:24:15.941; org.apache.solr.servlet.SolrDispatchFilter; [admin] webapp=null path=/admin/info/system params={_=1397211686554&wt=json} status=0 QTime=16
INFO  - 2014-04-11 12:24:20.785; org.apache.solr.servlet.SolrDispatchFilter; [admin] webapp=null path=/admin/info/logging params={_=1397211691410&since=0&wt=json} status=0 QTime=2
INFO  - 2014-04-11 12:24:22.578; org.apache.solr.servlet.SolrDispatchFilter; [admin] webapp=null path=/admin/cores params={_=1397211693200&wt=json} status=0 QTime=0
INFO  - 2014-04-11 12:24:24.230; org.apache.solr.servlet.SolrDispatchFilter; [admin] webapp=null path=/admin/info/properties params={_=1397211694858&wt=json} status=0 QTime=1

Friday, 21 March 2014

Packt’s amazing Buy One, Get One Free

Check out #Packt’s amazing Buy One, Get One Free offer #Packt2k

Friday, 17 January 2014

Secure web directory with exclude

Recently I needed to secure one of my websites. I set a password using htaccess and ... got an error on my main page. The reason was simple: the main page referenced a file inside the secured directory, so I needed to exclude that file from protection. Below is the solution.
# password protection allowing multiple resources
AuthType Basic
AuthName "Restricted Area"
AuthUserFile /home/path/.htpasswd
AuthGroupFile /dev/null
Require valid-user

# allow public access to the following resources
SetEnvIf Request_URI "(path/to/directory_01/)$" allow
SetEnvIf Request_URI "(path/to/directory_02/)$" allow
SetEnvIf Request_URI "(path/to/file\.php)$" allow
SetEnvIf Request_URI "(path/to/file\.html)$" allow
SetEnvIf Request_URI "(path/to/another/resource/)$" allow
SetEnvIf Request_URI "(path/to/yet/another/resource/)$" allow
Order allow,deny
Allow from env=allow
Satisfy any
Of course, you can also use "Allow from" to limit access to the secured directory, e.g. by IP address.
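The SetEnvIf patterns are ordinary regular expressions anchored with $ to the end of Request_URI, so they can be sanity-checked outside Apache with grep -E. The sample paths below are made up:

```shell
# only the URI that actually ends with path/to/file.php matches;
# the .bak and other.php variants fall through to the password prompt
printf '/site/path/to/file.php\n/site/path/to/file.php.bak\n/site/path/to/other.php\n' \
  | grep -E '(path/to/file\.php)$'
# prints: /site/path/to/file.php
```

The trailing $ is what keeps, say, file.php.bak from accidentally becoming public.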