Shortcuts
Revision as of 00:09, January 12, 2021
== Which site error log was triggered? ==
 for each in $(find /home/USER/public_html/ -type f -name "*error*log*"); do ls $each && tail -n1 $each && echo " "; done
== Skip the first 3 columns (useful for seeing errors in a given timeframe): ==
 grep '12-Feb-2014 12:' /home/$USER/public_html/error_log | awk '{ $1=""; $2=""; $3=""; print $0 }' | sort | uniq -c
== Another way to search by date: ==
 for each in $(find /home/USER/public_html/ -type f -name "*error*log*"); do grep -H 2015 $each | tail -n1; done
= Apache =
== Connections Going to WP Abuse ==
==== By Domain: ====
 apachectl fullstatus | grep 'xmlrpc.php\|wp-login.php' | awk '{print substr($0, index($0,$12))}' | awk -F ":" '{print $1, $2}' | awk '{print $3, $5, $6}' | sort | uniq -c | sort -k2
==== By Raw Hits: ====
 apachectl fullstatus | grep 'xmlrpc.php\|wp-login.php' | awk '{print substr($0, index($0,$12))}' | awk -F ":" '{print $1, $2}' | awk '{print $3, $5, $6}' | sort | uniq -c | sort -nr
== .htaccess in /home/user/ ==
 find /home/*/ -maxdepth 1 -name .htaccess
== Print only the rule IDs, hostnames, and URIs of ModSec violations, then sort them: ==
 grep $IPADDRESS /usr/local/apache/logs/error_log | grep 'Tue Jul 28' | grep ModSec | awk -F '\\[line ' '{print $2}' | awk -F '\\[unique_id' '{print $1}' | awk '{print $2, $3, $(NF-3), $(NF-2), $(NF-1), $(NF)}' | sort | uniq -c
= PHP =
== Custom php.ini values on suphp sites ==
 for i in $(find /home*/*/public_html -name .htaccess -not -name \*_vti_* -exec grep -iH suphp_ {} \; | awk -F" " '{ print $2"/php.ini" }' | sort | uniq); do echo $i; grep 'max_execution_time\|max_input_time\|memory_limit' $i; done
Also, check for .htaccess files in the userdir, not just the docroot.
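To sweep the userdir copies as well, a small helper can loop over each account's home directory (a sketch: the function name and the /home default are my own, adjust the glob to your server layout):

```shell
# List suPHP overrides found in each account's home-level .htaccess.
# The base directory defaults to /home; pass another path to override it.
# (The function name and default path are illustrative, not a standard tool.)
check_userdir_htaccess() {
    base="${1:-/home}"
    for f in "$base"/*/.htaccess; do
        [ -e "$f" ] || continue    # the glob may match nothing
        grep -iH 'suphp_' "$f"
    done
}
```

Run it as `check_userdir_htaccess` for /home, or e.g. `check_userdir_htaccess /home2` on multi-homedir boxes.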
= Listing Files =
== List files whose owner shows as a numeric UID instead of a username: ==
 ls -l | awk '{print $3, $9}' | grep '^[0-9]'
= LoadMon List =
Old old school, but keeping this just in case:
 cd /root/loadMon && ls -lahtr | rev | cut -d' ' -f1 | rev | grep -v './'
= LoadWatch =
== Old school grep ==
 grep -B 1 'Loadwatch tripped' /root/loadwatch/checklog | tail -n15
== New school ==
==== grep ====
 grep '##' /var/log/loadwatch/check.log | tail
==== Only double-digit or higher load averages ====
 grep '##' /var/log/loadwatch/check.log | grep -E 'load\[[0-9]{2,}'
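To see what that regex keeps, here is a synthetic check (the sample log lines are fabricated, not real check.log output):

```shell
# Fabricated check.log-style lines: only the two-or-more-digit load
# average survives the 'load\[[0-9]{2,}' filter.
printf '## 09:00 load[9.87]\n## 09:05 load[14.20]\n' \
  | grep -E 'load\[[0-9]{2,}'
# -> ## 09:05 load[14.20]
```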
==== Which cPanel accounts had non-FPM sites that were hit the most ====
 grep php-cgi /var/log/loadwatch/2019-03-05.11.39.txt | awk '{print $NF}' | cut -d '/' -f3 | sort | uniq -c | sort -rn
= Sed =
== Find and replace a string in a file: ==
 sed -i 's/foo/bar/g' FILENAME
== Search for terms at the beginning of each line with or without whitespace: ==
 sed 's/^foo *= *1/bar=0/'
== Append a new line after a search result: ==
 sed '/RESULT/ a NEWLINE'
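A quick demo with inline input (note the one-line `a text` form is GNU sed; BSD/macOS sed wants `a\` followed by the text on its own line):

```shell
# Append "bar=0" after every line matching /foo/ (GNU sed syntax).
# The demo input is inline; in practice RESULT and NEWLINE come from your file.
printf 'foo=1\nbaz=2\n' | sed '/foo/ a bar=0'
# -> foo=1
#    bar=0
#    baz=2
```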
== String multiple commands together: ==
 sed -e 's/^foo *= *1/#foo=1/; /#foo=1/ a bar=0'
== Replace // with / easily: ==
 sed 's_//_/_g'
== Delete all lines containing --: ==
 sed '/--/d' FILENAME
== Replace the first space in a line with an @ symbol: ==
 sed 's/ /@/'
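Without the `/g` flag only the first match on each line is touched, which is the whole trick here (demo input is made up):

```shell
# Only the first space becomes @; later spaces are left alone.
echo 'user example.com still here' | sed 's/ /@/'
# -> user@example.com still here
```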
== Grab a single line from a log file: ==
 sed -n '<number>{p;q}' <file>
So for line 46 in main.php:
 sed -n '46{p;q}' main.php
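== Copy a range of lines to a new file: ==
One common approach uses `-n` with an address range and `p` (the 10,20 range and the file names below are placeholders):

```shell
# Print only lines 10 through 20 of FILENAME and write them to NEWFILE.
# (Range and file names are placeholders; substitute your own.)
sed -n '10,20p' FILENAME > NEWFILE
```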