@greenfediverse @catgoat #CloudFlare is inherently energy wasteful. Bots & humans who use text-based browsers are the most environmentally respectful b/c they do not tend to download images (which consume far more bandwidth than text). When images are not downloaded, CF's #Google #CAPTCHA automatically treats the connection like a hostile bot, & actually pushes images to an otherwise very green session,
@catgoat @greenfediverse then #CloudFlare denies access to the very connections that are most respectful of the environment.
@LPS @greenfediverse a "green" project on Google-hosted #gitlab.com? Are you aware that Google sells machine learning to Total to dig for oil? Those Google CAPTCHAs aren't good for the environment either.
@hyperfocal @Tommy @ademalsasa there's a long list of decent ones, but nothing is better than Searxes (a specific searx node that filters out CF sites): https://ss.wodferndripvpe6ib4uz4rtngrnzichnirgn7t5x64gxcyroopbhsuqd.onion
@Horizon_Innovations Indeed it does: it collects any Google cookies that are still active.
@codeberg @tobtobxx I'm glad Tor was not permanently blocked -- that would have been an overly crude attempt at a remedy. Restricting access for the kinds of email accounts that require mobile phone number registration is also overly crude, and I hope that would be temporary until the server gets smarter about detecting & reacting to attacks. Perhaps access controls need to be more refined.
@wend I don't follow. If you're presented with a CAPTCHA, then you've already been treated as a robot whether you solve it or not. From there, I personally suggest /not/ solving CAPTCHAs b/c that supports the CAPTCHA pushers. When you dance for them you give them power.
@tobtobxx vandalism still happens, SSH or not, but this can be controlled by way of access controls. Not to mention cleanup tools. E.g. if your inbox gets spammed, the email firewall isn't your only defense. There is SpamAssassin, and the possibility of extending SpamAssassin's role beyond email.
@tobtobxx in principle there's no need for a web UI for GitHub. Repos can be serviced via SSH, and to date there's been no need to CAPTCHA SSH users. GitHub chooses to make some functions exclusively available in their web UI (e.g. PRs), but that's their choice. And it's that choice from which their perceived need for a CAPTCHA arises.
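To make the point concrete, here's a minimal sketch showing that a repo's remote can be pure SSH with no web UI (or HTTPS endpoint) in the loop. The SSH URL `git@example.com:user/repo.git` is a placeholder, not a real host:

```python
import subprocess
import tempfile

# Create a throwaway local repo to attach an SSH remote to.
repo = tempfile.mkdtemp()
subprocess.run(["git", "init", "-q", repo], check=True)

# Add a remote using the SSH URL form -- no HTTPS, no web UI, no CAPTCHA
# anywhere in the fetch/push path. The host here is a placeholder.
subprocess.run(["git", "-C", repo, "remote", "add", "origin",
                "git@example.com:user/repo.git"], check=True)

# `git remote -v` lists the fetch and push URLs, both over SSH.
out = subprocess.run(["git", "-C", repo, "remote", "-v"],
                     capture_output=True, text=True, check=True).stdout
print(out)
```

From there, `git fetch origin` and `git push origin` would go over SSH as well; it's only forge-exclusive features like PRs that pull you back into the web UI.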
@tobtobxx Your question presumes there's a problem to solve. Can you describe the problem?
@kravietz @mister_monster that's #Google's excuse, but what about Qwant and Ecosia (who get their results from Bing)? I suppose for them it's about stopping people who avoid the ads.
@kravietz @mister_monster That's a fair point for some sites -- but search engines and airfare sites are also using #CAPTCHA even when there's no form to submit, which must be to block scraping.
@mister_monster @kravietz I don't really see the point in attacking bots in the first place. Bots normally scrape text, not images, and it's the images that consume a significant portion of bandwidth.
#Google's #CAPTCHA detects whether a user downloads images. If not, they're presumed to be a bot & get attacked with a puzzle. And yet it's all the *images* that strain the network in the 1st place, not the text. #crappyDesign
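A minimal, hypothetical sketch of the heuristic described above -- flagging sessions that request pages but never the images on them. This is an illustration of the idea, not CloudFlare's or Google's actual logic, and the path-extension test is a deliberate simplification:

```python
def looks_like_bot(requested_paths):
    """Flag a session as 'bot-like' if it fetched HTML but no images.

    `requested_paths` is the list of URL paths a session requested.
    This mirrors the image-download heuristic described above, not any
    vendor's real implementation.
    """
    fetched_html = any(p.endswith((".html", "/")) for p in requested_paths)
    fetched_images = any(p.endswith((".png", ".jpg", ".gif", ".webp"))
                         for p in requested_paths)
    return fetched_html and not fetched_images

# A text-browser (or bot) session: pages only, no images -> flagged.
print(looks_like_bot(["/index.html", "/about/"]))    # True
# A graphical-browser session: pages plus images -> not flagged.
print(looks_like_bot(["/index.html", "/logo.png"]))  # False
```

The irony the post points out: the lightest sessions on the network are exactly the ones this heuristic punishes, and the "remedy" (an image CAPTCHA) pushes images at precisely the sessions that avoided them.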
@ademalsasa @Tommy the default search for both of those browsers is #DuckDuckGo, and that's bad for #privacy. https://dev.lemmy.ml/post/31321