Any time you want to use a large number of high-quality proxies for adding sneakers to a cart, you'll most likely be using a sneaker bot to do most of the work for you. Like everyone who “cooks” or uses sneaker bots, you don’t want your proxies banned while they are in the middle of checking out sneakers for you. If that happens, you’ll need a new proxy IP and you’ve already lost your spot in line. Once you discover the error, you’ll have to frantically set up new proxies to finish adding your sneakers to a cart, and by that time your favorite Air Jordan release could have already ended.
This isn’t always what happens, but it’s undeniably frustrating when your proxy’s IP gets banned right while you are checking out or adding sneakers to a cart. At best, it wastes time, forces you to spend more resources than you should repairing the damage, and keeps your task from running smoothly. That’s why you should take steps to make sure your IPs don’t get banned before you start your sneaker bot’s task.
To understand how to avoid bans on your IPs, you first need to understand how sites detect a proxy IP. Websites commonly identify the items in the following list as red flags:
• Numerous queries coming in from an irrelevant geolocation
• Numerous identical queries coming in at once
• Numerous queries coming in from the same web browser
• Numerous queries coming in using high-risk terms
While the items in the list above are what most commonly cause servers to flag an IP, they can also occur in the course of normal work when you’re not even using a proxy. For example, if you wanted to scan the first 20 pages of Bing search results to study the titles of blog posts on one website for a particular search phrase, you’d most likely use the site: operator. What some people don’t realize, however, is that doing so can trigger multiple captchas, and failing one can get your IP blocked. The steps listed below will help you avoid getting flagged while performing your research.
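To make the scenario concrete, here is a minimal sketch of the kind of research run described above. It assumes Python and simply builds the result-page URLs; the domain example.com, the query phrase, and the "first" paging parameter are illustrative placeholders, not a guaranteed Bing API.

```python
# Sketch: building the first 20 pages of a site-restricted Bing query.
# example.com, the phrase, and the paging parameter are placeholders.
from urllib.parse import urlencode

QUERY = 'site:example.com "sneaker bot"'   # the site: operator restricts results to one domain
RESULTS_PER_PAGE = 10

for page in range(20):                      # first 20 pages of results
    params = urlencode({"q": QUERY, "first": page * RESULTS_PER_PAGE + 1})
    url = f"https://www.bing.com/search?{params}"
    print(url)  # fetching these URLs back-to-back is exactly what can trigger captchas
```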
Be Careful About Using High Risk Geolocations
IP addresses are virtual addresses. They are used to identify the origin of the incoming connection. Anyone with a basic knowledge of programming can use the IP address to determine what country the user is coming from.
Proxies mask your IP address by operating as a conduit in the middle of your line of communication. For example, if you are in Nevada sending a connection to London through a proxy IP based in Baghdad, the destination server will perceive the incoming traffic as originating in Iraq. Because the server cannot see past the proxy, it has no way of knowing that you are actually in the United States.
Most servers are programmed to recognize high-risk countries such as Iraq as foreign traffic locations and to flag them as a warning sign for potential fraud. If you’ve ever happened to receive a phone call from someone with a curious accent swearing he’s from your credit card company, you know exactly how big of an issue foreign electronic communication can be.
The easiest way to resolve this sort of problem is to simply make sure to use a high quality proxy that originates in a tier 1 country, or a country in the same region as the site you’re connecting to. Try to use proxies from Western Europe or North America if you’re checking out on the US Nike, Adidas or Supreme website.
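As a quick illustration, here is a minimal sketch, assuming Python and the requests library, of routing a request through a proxy based in the same region as the target store. The proxy address and credentials are placeholders you would replace with your own.

```python
# Sketch: sending traffic through a US-based proxy when targeting a US storefront.
# The proxy URL and credentials below are placeholders.
import requests

US_PROXY = "http://user:password@us-proxy.example.com:8080"
proxies = {"http": US_PROXY, "https": US_PROXY}

resp = requests.get("https://www.nike.com/", proxies=proxies, timeout=10)
print(resp.status_code)  # the site sees the proxy's US geolocation, not your real location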
Establish a Unique User Agent for Each of Your IPs
The user agent is a request header included with every communication from your computer to the server of the site you are visiting. Typically, it contains only your operating system version, your language, and the version of the browser you use.
Although this information is anonymous, it is still identifying. If Bing notices 20 searches for identical information performed within a second from the same browser version, it will make the very reasonable assumption that the 20 queries are the work of a single operator running 20 bots.
The user agent information may vary, depending on your connection and your bot’s features. If supported, you should be able to manually configure each proxy to use a different user agent, which will help to make the traffic between your computer and the server appear to be organic.
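Here is a minimal sketch, again assuming Python and the requests library, of pairing each proxy with its own user agent. The proxy addresses and user-agent strings are placeholders; a real setup would pull these from your bot's configuration.

```python
# Sketch: each proxy announces a different browser/OS combination.
# Proxy addresses and user-agent strings are placeholders.
import requests

PROXY_AGENTS = {
    "http://user:pass@proxy-1.example.com:8080":
        "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36",
    "http://user:pass@proxy-2.example.com:8080":
        "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/17.0 Safari/605.1.15",
}

for proxy, agent in PROXY_AGENTS.items():
    resp = requests.get(
        "https://www.bing.com/",
        proxies={"http": proxy, "https": proxy},
        headers={"User-Agent": agent},  # a distinct user agent per proxy keeps the traffic looking organic
        timeout=10,
    )
    print(proxy, resp.status_code)
```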
Set Up a Native Referrer Source
The referrer is a different piece of information, but it still leaves identifying marks on the server you’re connecting to.
The referrer is where the server you are visiting believes you came from. If you open a new tab in your web browser, type “www.bing.com” and press enter, that visit appears as direct traffic with no referrer. That works fine for a homepage, but queries are a different story. Typing a full search-results URL directly into the address bar is very unlikely behavior; Bing expects you to reach a results page from its homepage or from another results page, so a results request that shows up as direct traffic will most likely be flagged as a warning sign. In a similar fashion, if you are scraping data from Walmart, the server will expect a referral from Walmart, not direct traffic.
If by some stroke of bad luck your referrer ends up set to another site, even if it’s your own web page, the server you are trying to access will see a large number of different queries flooding in, all referred by the same site. That is distinctly bot-driven traffic, and it will almost certainly be blocked.
The easiest way around this problem is to set a native referrer that matches the site you are querying. If you are sending a lot of traffic to several search pages on Bing, set the referrer to Bing to avoid being flagged.
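A minimal sketch of a native referrer, assuming Python and the requests library: the search phrase and user agent are placeholders, and note that the HTTP header is spelled “Referer”.

```python
# Sketch: sending a native referrer with a Bing search-results request.
# The query and user agent are placeholders.
import requests
from urllib.parse import urlencode

url = f"https://www.bing.com/search?{urlencode({'q': 'limited release sneakers'})}"

headers = {
    "Referer": "https://www.bing.com/",  # looks as if you arrived from Bing's own homepage
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",  # placeholder user agent
}

resp = requests.get(url, headers=headers, timeout=10)
print(resp.status_code)
```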
Establish a Rate Limit on Requests
One of the most common reasons proxies get blocked is that no rate limit has been set. One little-known fact about the internet is that most websites don’t have a problem with bots in general. In fact, the search spiders used by engines such as Bing and Google are bots. Bots are commonly used to search for breaches in website security and to speed up link clicking and content browsing.
Bots only become a problem the web server wants to handle when they start to tamper with the site. Bots repeatedly trying to log in to a website are a common reason a web server raises a red flag. This is where rate limits come in. A bot making dozens of requests per second is either trying to accomplish one task very quickly or many tasks in rapid succession. Legitimate humans don’t cycle through requests this quickly, so web servers are configured to block this type of activity.
When you’re adding limited release sneakers to a cart, you want to get it done as fast as you can, because the more time you spend waiting, the less chance you have of snagging the limited release sneakers before they’re sold out. What’s happening more and more is that sites like Adidas, Nike or Supreme will “ghost ban” the faster datacenter proxies while leaving the slower residential proxies unscathed. When you establish a rate limit on your requests, the server sees that even if the requests look like they are coming from a bot, you are not working with malicious intent. In fact, maybe you aren’t even a bot at all. Keeping the server guessing at who you are helps to keep your proxies from being blocked.
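Here is a minimal sketch of a simple rate limit, assuming Python and the requests library: at most one request per second through the same proxy. The proxy, the URL list, and the one-second interval are placeholder assumptions; the right limit depends on the target site.

```python
# Sketch: capping requests through one proxy at roughly one per second.
# Proxy credentials and URLs are placeholders.
import time
import requests

PROXY = {"http": "http://user:pass@proxy.example.com:8080",
         "https": "http://user:pass@proxy.example.com:8080"}
URLS = [f"https://www.bing.com/search?q=sneakers&first={i * 10 + 1}" for i in range(5)]

MIN_INTERVAL = 1.0  # seconds between requests through the same proxy

for url in URLS:
    started = time.monotonic()
    resp = requests.get(url, proxies=PROXY, timeout=10)
    print(url, resp.status_code)
    elapsed = time.monotonic() - started
    if elapsed < MIN_INTERVAL:
        time.sleep(MIN_INTERVAL - elapsed)  # pad the gap so the limit is never exceeded
```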
Run All of Your Requests Asynchronously
If you are trying to make 100 queries with 10 bots running on 10 Supreme proxies, Nike proxies, Adidas proxies or sneaker proxies, your first thought might be to have each bot send one query per second so the process is over as soon as possible. The problem with this approach, however, is that the server will see 10 nearly identical queries coming in every second for 10 seconds in a row, which is highly suspicious behavior. Normal people (the users you are attempting to imitate) browse from one link to the next, not all at once. Staggering the queries so that there is a one or two second delay between each one helps break up the pattern and fool the web server.
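A minimal sketch of that staggering, assuming Python and the requests library: requests rotate through a small proxy pool with a randomized one-to-two-second pause between them instead of firing in lockstep. The proxy addresses and URLs are placeholders.

```python
# Sketch: rotating proxies with a 1-2 second randomized delay between requests.
# Proxy addresses and URLs are placeholders.
import random
import time
import requests

PROXIES = [
    "http://user:pass@proxy-1.example.com:8080",
    "http://user:pass@proxy-2.example.com:8080",
    "http://user:pass@proxy-3.example.com:8080",
]
URLS = [f"https://www.bing.com/search?q=sneakers&first={i * 10 + 1}" for i in range(9)]

for i, url in enumerate(URLS):
    proxy = PROXIES[i % len(PROXIES)]  # rotate through the proxy pool
    resp = requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=10)
    print(proxy, resp.status_code)
    time.sleep(random.uniform(1.0, 2.0))  # the jitter breaks up the lockstep pattern
```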
How to Choose Between Residential Sneaker Proxies and Datacenter Sneaker Proxies
As someone who frequently uses proxies, you are probably wondering whether you should use residential proxies for sneaker add to cart (ATC) or datacenter proxies. There’s an easy way to look at it, and a complicated one.
Residential proxies are simply IP addresses issued by a standard Internet Service Provider, typically over the cable or DSL line wired right into your home. Datacenter proxies are IP addresses issued by a secondary corporation rather than by a home ISP.
With residential proxies, each time you connect to the internet, the web server can identify your location, roughly map it, and name your ISP. Datacenter IP addresses are not associated with an ISP and cannot provide internet access on their own. Basically, they work like a proxy because they modify or hide the IP address the server sees.
There are many reasons someone might want to use residential proxies, but by far the most common is anonymity. If you use Verizon as your ISP, for example, anyone who sees your IP address can make a pretty good guess at where you live and tell which ISP you subscribe to.
Suppose you want to use proxies to add sneakers to a cart on a site such as Nike or Adidas. If you used datacenter proxies, the website could instantly recognize that an atypical access is being attempted from a non-residential source. That is an immediate red flag and could get you blocked on the spot. If you use a residential proxy, however, the server will perceive your access to be legitimate.
It all boils down to the fact that residential proxies are, overall, safer than datacenter proxies. Residential proxies are the go-to for anyone who wants to stay anonymous and undetected and ensure a safe checkout on online shops like Nike, Adidas or Supreme. If it turns out that you really must use a datacenter proxy, make sure the company you use is trustworthy and offers a diverse range of C-class IPs to help avoid detection.
Hi Kenny, I’m Ronald and I’m based in Asia, but I want to cop on Supreme US, so I used a G Cloud server located in the US along with US proxies, and I always get block problems on my Cyber bot. How can I solve that? Some people tell me Supreme banned G Cloud, others say it’s because of the proxy. Can you help me? Thanks.
Hi Ronald,
Using anonymous private proxies is important for being able to browse Supreme and sneaker sites alike. If your proxy is private and anonymous, the server you’re using the proxies on should not matter. In this scenario, the server is being used mainly to increase speeds. The proxies increase your privacy so you can avoid blocks when browsing. So, I’m going to guess that your proxies were causing the block issue.
If you’re looking for unblocked private / dedicated proxies, we can help! Drop us a line here: https://rotatingproxies.com/contact
Hi mate, I want to run multiple Amazon accounts. I tried residential proxies with the Multilogin app, but all the accounts were suspended on their first order, yet my Amazon accounts run well on some residential-connection VPSes. Just yesterday I found out that there may be no real residential-connection VPS on the market, and that everyone is using a datacenter VPS configured with backconnect static proxies. Is that true? How can I run my accounts without getting suspended? Please help me out with this. On what basis does Amazon suspend accounts: the IP only, or some other background lookup?
Note: Running accounts for orders only.
Hi Himmu, thank you for reaching out! We recommend using dedicated proxies for managing accounts, such as Premium Dedicated Residential Proxies.
You can find even more information on scraping Amazon in this handy blog post: The Top 5 Guidelines for Scraping Amazon Safely