Troubleshooting a Linux web server: the specifics are covered below.
System connection status:
1. Check TCP connection states
netstat -nat | awk '{print $6}' | sort | uniq -c | sort -rn
netstat -n | awk '/^tcp/ {++S[$NF]}; END {for(a in S) print a, S[a]}'
netstat -n | awk '/^tcp/ {++state[$NF]}; END {for(key in state) print key,"\t",state[key]}'
netstat -n | awk '/^tcp/ {++arr[$NF]}; END {for(k in arr) print k,"\t",arr[k]}'
netstat -n | awk '/^tcp/ {print $NF}' | sort | uniq -c | sort -rn
netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c
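On newer distributions where net-tools is no longer installed, ss can produce the same state summary. A minimal sketch, assuming iproute2 is available:
ss -tan | awk 'NR>1 {++s[$1]} END {for (k in s) print s[k], k}' | sort -rn   # count TCP sockets per state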
2. Find the 20 IPs making the most requests (often used to locate the source of an attack):
netstat -anlp | grep 80 | grep tcp | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | head -n20
netstat -ant | awk '/:80/{split($5,ip,":");++A[ip[1]]}END{for(i in A) print A[i],i}' | sort -rn | head -n20
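Once an offending address is identified, a common follow-up is to drop its traffic at the firewall. A sketch, assuming iptables is in use and with 1.2.3.4 as a placeholder for the attacking IP:
iptables -I INPUT -s 1.2.3.4 -j DROP   # insert a rule that silently drops all traffic from this source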
3. Sniff port 80 with tcpdump to see which host accesses it the most
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk -F"." '{print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20
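The awk above splits the whole line on dots, so with tcpdump's usual one-line output ("IP src.port > dst.port: ...") the first four dot-separated pieces reassemble into the source IP. An equivalent sketch that isolates the source field explicitly, under the same assumed output format:
tcpdump -i eth0 -tnn dst port 80 -c 1000 | awk '{print $2}' | awk -F. 'NF>4 {print $1"."$2"."$3"."$4}' | sort | uniq -c | sort -nr | head -20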
4. Find addresses with many TIME_WAIT connections
netstat -n | grep TIME_WAIT | awk '{print $5}' | sort | uniq -c | sort -rn | head -n20
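Large TIME_WAIT counts are often normal on a busy server, but if they exhaust local ports the kernel can be tuned. A sketch, assuming a reasonably modern Linux kernel that exposes these sysctl keys:
sysctl -w net.ipv4.tcp_fin_timeout=30   # shorten the FIN-WAIT-2 timeout
sysctl -w net.ipv4.tcp_tw_reuse=1       # allow reusing TIME_WAIT sockets for new outgoing connections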
5. Find addresses with many SYN connections
netstat -an | grep SYN | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | sort -nr | more
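If the SYN counts point to a SYN flood rather than legitimate load, SYN cookies are the usual first mitigation. A sketch, assuming the kernel was built with SYN cookie support (the default on mainstream distributions):
sysctl -w net.ipv4.tcp_syncookies=1   # answer SYNs with cookies instead of filling the backlog queue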
6. List the process behind a port
netstat -ntlp | grep 80 | awk '{print $7}' | cut -d/ -f1
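lsof can answer the same question from the other direction. A sketch, assuming lsof is installed:
lsof -i :80 -sTCP:LISTEN   # show the process listening on TCP port 80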
Website log analysis part 1 (Apache):
1. Get the top 10 visiting IP addresses
cat access.log | awk '{print $1}' | sort | uniq -c | sort -nr | head -10
cat access.log | awk '{counts[$(11)]+=1}; END {for(url in counts) print counts[url], url}'
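The $n field numbers used throughout this section assume Apache's common/combined log format; if your LogFormat differs, adjust them accordingly. A quick sanity check, assuming the usual layout where $1 is the client IP, $7 the request path, $9 the status code and $10 the bytes sent:
head -1 access.log | awk '{print "client:", $1, "path:", $7, "status:", $9, "bytes:", $10}'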
2. The most frequently requested files or pages, top 20
cat access.log | awk '{print $11}' | sort | uniq -c | sort -nr | head -20
3. List the exe files with the largest transfers (useful when analyzing a download site)
cat access.log | awk '($7~/\.exe/){print $10 " " $1 " " $4 " " $7}' | sort -nr | head -20
4. List exe files larger than 200000 bytes (about 200KB) and how often each occurs
cat access.log | awk '($10 > 200000 && $7~/\.exe/){print $7}' | sort -n | uniq -c | sort -nr | head -100
5. If the last column of the log records the page transfer time, list the pages that are slowest to reach the client
cat access.log | awk '($7~/\.php/){print $NF " " $1 " " $4 " " $7}' | sort -nr | head -100
6. List the slowest pages (those taking more than 60 seconds) and how often each occurs
cat access.log | awk '($NF > 60 && $7~/\.php/){print $7}' | sort -n | uniq -c | sort -nr | head -100
7. List files whose transfer time exceeds 30 seconds
cat access.log | awk '($NF > 30){print $7}' | sort -n | uniq -c | sort -nr | head -20
8. Total site traffic (GB)
cat access.log | awk '{sum+=$10} END {print sum/1024/1024/1024}'
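To get a human-readable figure without the manual division, coreutils numfmt can format the byte total. A sketch, assuming GNU coreutils:
awk '{sum+=$10} END {print sum}' access.log | numfmt --to=iec   # e.g. prints "1.2G"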
9. Count 404 responses
awk '($9 ~/404/)' access.log | awk '{print $9,$7}' | sort
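A variant that ranks which URLs produce the most 404s, under the same field-layout assumption:
awk '($9 == "404") {print $7}' access.log | sort | uniq -c | sort -rn | head -20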
10. Tally HTTP status codes
cat access.log | awk '{counts[$(9)]+=1}; END {for(code in counts) print code, counts[code]}'
cat access.log | awk '{print $9}' | sort | uniq -c | sort -rn
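A sketch that adds a percentage to each status-code count, same assumptions as above:
awk '{c[$9]++; n++} END {for (k in c) printf "%s %d %.1f%%\n", k, c[k], 100*c[k]/n}' access.log | sort -k2 -rn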
11. Spider analysis: see which spiders are crawling your content.
/usr/sbin/tcpdump -i eth0 -l -s 0 -w - dst port 80 | strings | grep -i user-agent | grep -i -E 'bot|crawler|slurp|spider'
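Sniffing requires root and only sees live traffic; the access log usually answers the same question after the fact. A sketch, assuming the combined log format where the quoted user-agent is the sixth double-quote-delimited field:
awk -F\" '{print $6}' access.log | grep -iE 'bot|crawler|slurp|spider' | sort | uniq -c | sort -rn | head -20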
Website log analysis part 2 (Squid): traffic statistics by domain
zcat squid_access.log.tar.gz | awk '{print $10,$7}' | awk 'BEGIN{FS="[ /]"}{trfc[$4]+=$1}END{for(domain in trfc){printf "%s\t%d\n",domain,trfc[domain]}}'
Database section
1. Watch the SQL statements the database is executing
/usr/sbin/tcpdump -i eth0 -s 0 -l -w - dst port 3306 | strings | egrep -i 'SELECT|UPDATE|DELETE|INSERT|SET|COMMIT|ROLLBACK|CREATE|DROP|ALTER|CALL'
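This only works while client traffic is unencrypted and crosses the sniffed interface; the server's own general query log is the more reliable source. A sketch, assuming a MySQL account with privileges to set global variables:
mysql -e "SET GLOBAL general_log_file='/tmp/mysql-general.log'; SET GLOBAL general_log='ON';"
tail -f /tmp/mysql-general.log   # watch queries as they arrive; remember to turn the log off afterwards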
System debugging section
1. Debugging command
strace -p pid
2. Attach to a specific process by PID
gdb -p pid
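Both commands attach to a live process and can slow it down, so scope what you trace. A slightly fuller strace invocation, with 1234 as a placeholder PID:
strace -p 1234 -f -tt -e trace=network -o /tmp/strace.out   # follow forks, timestamp each call, log only network syscalls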
That is all the content of this article. I hope it is helpful for your learning.