
Automated Batch Vulnerability Scanning in 2021 (Security Testing Only)



AWVS (Acunetix Web Vulnerability Scanner) is an automated application security testing tool that runs on Windows. It is mainly used to scan web applications for security issues such as SQL injection, XSS, directory traversal, and command injection.

xray (https://github.com/chaitin/xray) is the community-edition vulnerability scanner extracted from the core engine of Chaitin's commercial Dongjian (洞鉴) product. It supports both active and passive scanning, ships with its own out-of-band (blind) testing platform, lets you define POCs flexibly, is feature-rich and simple to invoke, and runs on Windows / macOS / Linux, covering the automated web vulnerability detection needs of most security practitioners.

AWVS's crawler works very well, with support for form analysis and crawling single-page applications, while xray's scanning engine is stronger and faster.

Here is a brief walkthrough of how to chain the two for automated batch vulnerability scanning. The process is quite simple and is also covered in the official xray documentation; the AWVS batch-import script is taken directly from another researcher online (thanks for the script!), and this post just records my learning process.

First, start xray's passive proxy:

xray.exe webscan --listen 127.0.0.1:7777 --html-output report.html

This command listens on local port 7777 and exports the scan results in HTML format to a file named report.html.


Let's first walk through adding a target manually.

Open AWVS, add a target, and then set its proxy server to 127.0.0.1:7777.

Here the target is the test site provided by AWVS, http://testphp.vulnweb.com/ (a quick way to confirm that traffic actually reaches xray is sketched below).
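
Before kicking off a full scan, you can push a single request through the listener yourself; anything that flows through the proxy is scanned passively. A minimal sketch in Python, assuming the listener address and test site used above:

# send one request through the xray listener; it should show up in xray's console output
import requests

proxies = {
    "http": "http://127.0.0.1:7777",
    "https": "http://127.0.0.1:7777",  # for HTTPS targets, xray's ca.crt must be trusted (or use verify=False)
}
r = requests.get("http://testphp.vulnweb.com/", proxies=proxies, timeout=30)
print(r.status_code)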


Start the scan; at this point xray has automatically started scanning as well.


When there are many targets, you can add them in bulk, but AWVS's built-in import feature only accepts 100 targets at a time and often fails to upload at all (or maybe that is just me?). Fortunately a researcher online has already published a batch-import script (thanks again for the script); with a few small tweaks it makes the whole process fly. (The script is at the end of this post.)

First, the apikey has to be replaced with your own. Log in to the AWVS web UI and find it under the profile settings in the top-right corner; unless you manually generate a new API key, the same one keeps working, so I won't go into more detail. (A quick way to verify the key is sketched below.)
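
Before running the batch script it is worth confirming that the key is accepted. A minimal sketch, assuming the AWVS REST API answers GET /api/v1/targets with the same X-Auth header used by the script at the end of this post (adjust the address and port to your install):

# quick API-key check against the AWVS REST API
import requests
requests.packages.urllib3.disable_warnings()

apikey = 'your_api_key_here'  # placeholder, replace with your own key
r = requests.get('https://127.0.0.1:3443/api/v1/targets',
                 headers={'X-Auth': apikey, 'Content-Type': 'application/json'},
                 verify=False, timeout=30)
print(r.status_code)  # 200 means the key is accepted; 401 usually means it is wrong or revoked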


The second place to edit is the configuration block at the end of the script:

proxy_address and proxy_port should be set to the address and port xray is listening on.

awvs_url is the address of the AWVS web UI.

user_agent is a custom User-Agent header; to keep things simple I just use the default, other values are easy to look up.

profile_id selects the AWVS scan profile. I usually use Full Scan or Crawl Only; the full list is as follows (a small helper for picking one is sketched right after the table):

Profile                                  profile_id
Full Scan                                11111111-1111-1111-1111-111111111111
High Risk Vulnerabilities                11111111-1111-1111-1111-111111111112
SQL Injection Vulnerabilities            11111111-1111-1111-1111-111111111113
Weak Passwords                           11111111-1111-1111-1111-111111111115
Cross-site Scripting Vulnerabilities     11111111-1111-1111-1111-111111111116
Crawl Only                               11111111-1111-1111-1111-111111111117
Malware Scan                             11111111-1111-1111-1111-111111111120
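
If you switch profiles often, keeping the table in a small dictionary saves copy-pasting IDs. A minimal sketch (SCAN_PROFILES is just an illustrative name, not part of the original script):

# map the profile names from the table above to their AWVS profile_id values
SCAN_PROFILES = {
    "Full Scan": "11111111-1111-1111-1111-111111111111",
    "High Risk Vulnerabilities": "11111111-1111-1111-1111-111111111112",
    "SQL Injection Vulnerabilities": "11111111-1111-1111-1111-111111111113",
    "Weak Passwords": "11111111-1111-1111-1111-111111111115",
    "Cross-site Scripting Vulnerabilities": "11111111-1111-1111-1111-111111111116",
    "Crawl Only": "11111111-1111-1111-1111-111111111117",
    "Malware Scan": "11111111-1111-1111-1111-111111111120",
}

profile_id = SCAN_PROFILES["Crawl Only"]  # e.g. crawl only, letting xray do the actual scanning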


Once everything is ready, create url.txt in the same directory and put the targets in it, one per line; here I add a couple of arbitrary targets. (An optional clean-up pass for the file is sketched below.)
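
The script reads url.txt line by line, so blank lines and duplicates turn into extra targets. An optional clean-up sketch (not part of the original script) that normalizes the file first:

# optional: strip blank lines and duplicates from url.txt before running the batch script
seen = set()
cleaned = []
with open('url.txt', 'r', encoding='utf-8') as f:
    for line in f:
        url = line.strip()
        if url and url not in seen:
            seen.add(url)
            cleaned.append(url)
with open('url.txt', 'w', encoding='utf-8') as f:
    f.write('\n'.join(cleaned) + '\n')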


Once the targets are added, just run the script.


At this point the targets have been added to AWVS automatically, chaining AWVS and xray for batch automated vulnerability scanning.


AWVS scan results have to be viewed by logging in to its web UI; the xray scan results are written to the file specified earlier (report.html) in the same directory. (If you prefer machine-readable output, see the JSON sketch below.)
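
According to the xray documentation it can also write JSON via --json-output, which is easier to post-process than the HTML report, for example:

xray.exe webscan --listen 127.0.0.1:7777 --html-output report.html --json-output report.json

A rough summary sketch in Python; the field names ("plugin", "target", "url") are assumptions and may need adjusting to the file your xray version actually produces:

# summarize a JSON report produced with --json-output report.json
# assumes the report is a JSON array of findings
import json

with open('report.json', 'r', encoding='utf-8') as f:
    findings = json.load(f)

print('total findings:', len(findings))
for item in findings:
    plugin = item.get('plugin', 'unknown')
    target = item.get('target', {})
    url = target.get('url', '') if isinstance(target, dict) else target
    print(plugin, url)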


Here is the batch script (credit to the original author):


import requests
import json
from requests.packages.urllib3.exceptions import InsecureRequestWarning

requests.packages.urllib3.disable_warnings(InsecureRequestWarning)

apikey = '1986ad8c0a5b3df4d7028d5f3c06e936c1e3809ff358046e9b85e30ceb7dbea20'  # AWVS API key
headers = {'Content-Type': 'application/json', "X-Auth": apikey}


def addTask(url, target):
    # add a target to AWVS and return its target_id
    try:
        url = ''.join((url, '/api/v1/targets/add'))
        data = {"targets": [{"address": target, "description": ""}], "groups": []}
        r = requests.post(url, headers=headers, data=json.dumps(data), timeout=30, verify=False)
        result = json.loads(r.content.decode())
        return result['targets'][0]['target_id']
    except Exception as e:
        return e


def scan(url, target, Crawl, user_agent, profile_id, proxy_address, proxy_port):
    # create the target, push its proxy configuration, then start a scan
    scanUrl = ''.join((url, '/api/v1/scans'))
    target_id = addTask(url, target)

    if target_id:
        data = {"target_id": target_id, "profile_id": profile_id, "incremental": False,
                "schedule": {"disable": False, "start_date": None, "time_sensitive": False}}
        try:
            configuration(url, target_id, proxy_address, proxy_port, Crawl, user_agent)
            response = requests.post(scanUrl, data=json.dumps(data), headers=headers, timeout=30, verify=False)
            result = json.loads(response.content)
            return result['target_id']
        except Exception as e:
            print(e)


def configuration(url, target_id, proxy_address, proxy_port, Crawl, user_agent):
    # point the target's upstream proxy at xray so the scan traffic is mirrored to it
    configuration_url = ''.join((url, '/api/v1/targets/{0}/configuration'.format(target_id)))
    data = {"scan_speed": "fast", "login": {"kind": "none"}, "ssh_credentials": {"kind": "none"},
            "sensor": False, "user_agent": user_agent, "case_sensitive": "auto",
            "limit_crawler_scope": True, "excluded_paths": [],
            "authentication": {"enabled": False},
            "proxy": {"enabled": Crawl, "protocol": "http", "address": proxy_address, "port": proxy_port},
            "technologies": [], "custom_headers": [], "custom_cookies": [], "debug": False,
            "client_certificate_password": "", "issue_tracker_id": "", "excluded_hours_id": ""}
    r = requests.patch(url=configuration_url, data=json.dumps(data), headers=headers, timeout=30, verify=False)


def main():
    Crawl = True
    proxy_address = '127.0.0.1'
    proxy_port = '7777'
    awvs_url = 'https://127.0.0.1:3443'  # AWVS web UI address
    with open('url.txt', 'r', encoding='utf-8') as f:
        targets = f.readlines()
    profile_id = "11111111-1111-1111-1111-111111111111"
    user_agent = "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.21 (KHTML, like Gecko) Chrome/41.0.2228.0 Safari/537.21"  # default scan User-Agent
    if Crawl:
        profile_id = "11111111-1111-1111-1111-111111111111"
    for target in targets:
        target = target.strip()
        if scan(awvs_url, target, Crawl, user_agent, profile_id, proxy_address, int(proxy_port)):
            print("{0} added successfully".format(target))


if __name__ == '__main__':
    main()
Beyond this, there are many other handy and practical combinations, such as Burp Suite + xray, FOFA + xray, 360 crawler + xray, Goby + AWVS + xray, and so on; the principle is largely the same. You can also hook in Server酱 (ServerChan) for automatic alerting, pushing "a vulnerability has arrived" notifications straight to WeChat (possibly due to my environment, I never got this working; if you are interested, see the official docs: https://docs.xray.cool/#/scenario/xray_vuln_alert ). A rough idea of what the alerting leg could look like is sketched below.
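
As a rough sketch only: xray can push findings to an HTTP endpoint (its --webhook-output option), and a tiny receiver can forward them to ServerChan. The ServerChan endpoint format and the exact shape of xray's webhook payload are assumptions here; check them against the official documentation linked above.

# minimal webhook receiver sketch, e.g. xray.exe webscan --listen 127.0.0.1:7777 --webhook-output http://127.0.0.1:8000/webhook
# assumes xray POSTs JSON findings; forwards a short summary to ServerChan (Server酱)
import json
import requests
from http.server import BaseHTTPRequestHandler, HTTPServer

SENDKEY = 'your_serverchan_sendkey'  # placeholder, replace with your own SendKey

class WebhookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get('Content-Length', 0))
        body = self.rfile.read(length).decode('utf-8', errors='replace')
        try:
            finding = json.loads(body)
        except ValueError:
            finding = None
        title = 'xray finding'
        if isinstance(finding, dict):
            # the "plugin" field name is a guess; adjust to the payload xray actually sends
            title = 'xray finding: {0}'.format(finding.get('plugin', 'unknown'))
        requests.post('https://sctapi.ftqq.com/{0}.send'.format(SENDKEY),
                      data={'title': title, 'desp': body[:2000]}, timeout=10)
        self.send_response(200)
        self.end_headers()

if __name__ == '__main__':
    HTTPServer(('127.0.0.1', 8000), WebhookHandler).serve_forever()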


 

A final word!

All technical articles on this site are for technical exchange and learning only. Applying them to improper ends is prohibited, and the author takes no responsibility for risks arising from abuse of these techniques!

Never scan or test sites you are not authorized to test, otherwise...
