Proxy server


Communication between two computers connected through a third computer acting as a proxy server. This can protect Alice's privacy, as Bob only knows about the proxy and cannot identify or contact Alice directly.

In computer networking, a proxy server is a server application that acts as an intermediary between a client requesting a resource and the server providing that resource. [1] It can improve privacy, security, and performance in the process.


Instead of connecting directly to a server that can fulfill a request for a resource, such as a file or web page, the client directs the request to the proxy server, which evaluates the request and performs the required network transactions. This serves as a method to simplify or control the complexity of the request, or provide additional benefits such as load balancing, privacy, or security. Proxies were devised to add structure and encapsulation to distributed systems. [2] A proxy server thus functions on behalf of the client when requesting service, potentially masking the true origin of the request to the resource server.

Types

A proxy server may reside on the user's local computer, or at any point between the user's computer and destination servers on the Internet. A proxy server that passes unmodified requests and responses is usually called a gateway or sometimes a tunneling proxy. A forward proxy is an Internet-facing proxy used to retrieve data from a wide range of sources (in most cases, anywhere on the Internet). A reverse proxy is usually an internal-facing proxy used as a front-end to control and protect access to a server on a private network. A reverse proxy commonly also performs tasks such as load-balancing, authentication, decryption and caching. [3]

Open proxies

An open proxy forwarding requests from and to anywhere on the Internet

An open proxy is a forwarding proxy server that is accessible by any Internet user. In 2008, network security expert Gordon Lyon estimated that "hundreds of thousands" of open proxies are operated on the Internet. [4]

Reverse proxies

A reverse proxy taking requests from the Internet and forwarding them to servers in an internal network. Those making requests connect to the proxy and may not be aware of the internal network.

A reverse proxy (or surrogate) is a proxy server that appears to clients to be an ordinary server. Reverse proxies forward requests to one or more ordinary servers that handle the request. The response from the original server is returned as if it came directly from the proxy server, leaving the client with no knowledge of the original server. [5] Reverse proxies are installed in the vicinity of one or more web servers, and all traffic arriving from the Internet and destined for one of those web servers passes through the proxy. The use of "reverse" originates in its counterpart "forward proxy", since the reverse proxy sits closer to the web server and serves only a restricted set of websites. Reverse proxy servers are typically installed to provide load balancing, caching, decryption, authentication, or an additional layer of security for the servers behind them.
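
For illustration, the following minimal sketch of the forwarding step uses only the Python standard library. The backend address 127.0.0.1:8080 and the listening port 8000 are arbitrary assumptions, only GET is handled, and error handling is omitted; production reverse proxies add load balancing, caching, TLS termination, and connection pooling on top of this basic relay.

    from http.server import BaseHTTPRequestHandler, HTTPServer
    import urllib.request

    BACKEND = "http://127.0.0.1:8080"  # assumed address of the single origin server

    class ReverseProxyHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Relay the client's path to the backend and stream the answer back,
            # so the response appears to come from this proxy.
            with urllib.request.urlopen(BACKEND + self.path) as upstream:
                body = upstream.read()
                self.send_response(upstream.status)
                self.send_header("Content-Type",
                                 upstream.headers.get("Content-Type", "application/octet-stream"))
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("", 8000), ReverseProxyHandler).serve_forever()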

Uses

Monitoring and filtering

Content-control software

A content-filtering web proxy server provides administrative control over the content that may be relayed in one or both directions through the proxy. It is commonly used in both commercial and non-commercial organizations (especially schools) to ensure that Internet usage conforms to acceptable use policy.

Content-filtering proxy servers often support user authentication to control web access. They also usually produce logs, either to give detailed information about the URLs accessed by specific users or to monitor bandwidth usage statistics. They may also communicate with daemon-based or ICAP-based antivirus software to provide security against viruses and other malware by scanning incoming content in real time before it enters the network.

Many workplaces, schools, and colleges restrict which websites and online services are accessible from their buildings. Governments also censor undesirable content. This is done either with a specialized proxy called a content filter (both commercial and free products are available) or by using a cache-extension protocol such as ICAP, which allows plug-in extensions to an open caching architecture.

Websites commonly used by students to circumvent filters and access blocked content often include a proxy, from which the user can then access the websites that the filter is trying to block.

Requests may be filtered by several methods, such as URL or DNS blacklists, URL regex filtering, MIME filtering, or content keyword filtering. Blacklists are often provided and maintained by web-filtering companies and are frequently grouped into categories (pornography, gambling, shopping, social networks, etc.).

The proxy then fetches the content, assuming the requested URL is acceptable. At this point, a dynamic filter may be applied on the return path. For example, JPEG files could be blocked based on fleshtone matches, or language filters could dynamically detect unwanted language. If the content is rejected then an HTTP fetch error may be returned to the requester.
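
As an illustration of these two stages, the following minimal Python sketch separates the pre-fetch check (URL and DNS blacklists, URL regex) from the return-path check (MIME type, content keywords); the blacklist, pattern, and keyword values are hypothetical placeholders, whereas real filters rely on the large categorized databases described below.

    import re
    from urllib.parse import urlparse

    # Hypothetical placeholder rules; commercial filters use large, categorized databases.
    DOMAIN_BLACKLIST = {"blocked.example"}
    URL_PATTERNS = [re.compile(r"/gambling/", re.IGNORECASE)]
    CONTENT_KEYWORDS = ("forbidden-term",)

    def request_allowed(url):
        """Pre-fetch filtering: URL/DNS blacklist and URL regex checks."""
        parsed = urlparse(url)
        if parsed.hostname in DOMAIN_BLACKLIST:
            return False
        return not any(p.search(parsed.path) for p in URL_PATTERNS)

    def response_allowed(content_type, body):
        """Return-path filtering: MIME filtering and content keyword filtering."""
        if content_type.startswith("application/x-msdownload"):
            return False
        text = body.decode("utf-8", errors="ignore").lower()
        return not any(k in text for k in CONTENT_KEYWORDS)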

Most web filtering companies use an internet-wide crawling robot that assesses the likelihood that content is a certain type. Manual labor is used to correct the resultant database based on complaints or known flaws in the content-matching algorithms. [6]

Some proxies scan outbound content, e.g., for data loss prevention; or scan content for malicious software.

Filtering of encrypted data

Web filtering proxies are not able to peer inside HTTP transactions secured by SSL/TLS (Transport Layer Security), assuming the SSL/TLS chain of trust has not been tampered with. That chain of trust relies on trusted root certificate authorities.

In a workplace setting where the client is managed by the organization, devices may be configured to trust a root certificate whose private key is known to the proxy. In such situations, proxy analysis of the contents of an SSL/TLS transaction becomes possible. The proxy is effectively operating a man-in-the-middle attack, allowed by the client's trust of a root certificate the proxy owns.

Bypassing filters and censorship

If the destination server filters content based on the origin of the request, the use of a proxy can circumvent this filter. For example, a server using IP-based geolocation to restrict its service to a certain country can be accessed using a proxy located in that country. [7] :3

Web proxies are the most common means of bypassing government censorship, although no more than 3% of Internet users use any circumvention tools. [7] :7

Some proxy service providers allow businesses access to their proxy network for rerouting traffic for business intelligence purposes. [8]

In some cases, users can circumvent proxies that filter using blacklists by using services designed to proxy information from a non-blacklisted location. [9]

Many organizations block access to popular websites such as Facebook. Users can use proxy servers to circumvent this security. However, by connecting to proxy servers, they might be opening themselves up to danger by passing sensitive information such as personal photos and passwords through the proxy server. This image illustrates a common example: schools blocking websites for students.

Logging and eavesdropping

Proxies can be installed in order to eavesdrop upon the data flow between client machines and the web. All content sent or accessed – including passwords submitted and cookies used – can be captured and analyzed by the proxy operator. For this reason, passwords to online services (such as webmail and banking) should always be exchanged over a cryptographically secured connection, such as SSL/TLS.

By chaining proxies that do not reveal data about the original requester, it is possible to obfuscate activities from the eyes of the user's destination. However, more traces will be left on the intermediate hops, which could be used or offered up to trace the user's activities. If the policies and administrators of these other proxies are unknown, the user may fall victim to a false sense of security just because those details are out of sight and mind.

In what is more of an inconvenience than a risk, proxy users may find themselves being blocked from certain websites, as numerous forums and websites block IP addresses from proxies known to have spammed or trolled the site. Proxy bouncing can be used to maintain privacy.

Improving performance

A caching proxy server accelerates service requests by retrieving the content saved from a previous request made by the same client or even other clients. [10] Caching proxies keep local copies of frequently requested resources, allowing large organizations to significantly reduce their upstream bandwidth usage and costs, while significantly increasing performance. Most ISPs and large businesses have a caching proxy. Caching proxies were the first kind of proxy server. Web proxies are commonly used to cache web pages from a web server. [11] Poorly implemented caching proxies can cause problems, such as an inability to use user authentication. [12]
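
A minimal Python sketch of the caching idea, keyed only by URL, is shown below; real caching proxies additionally honour HTTP cache-control headers, expiry, and revalidation, all of which are omitted here.

    import urllib.request

    _cache = {}  # maps a URL to the body fetched for a previous request

    def cached_fetch(url):
        """Serve a local copy if one exists; otherwise fetch upstream and store it."""
        if url not in _cache:
            with urllib.request.urlopen(url) as resp:
                _cache[url] = resp.read()
        return _cache[url]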

A proxy that is designed to mitigate specific link-related issues or degradation is a Performance Enhancing Proxy (PEP). These are typically used to improve TCP performance in the presence of high round-trip times or high packet loss (such as wireless or mobile-phone networks), or on highly asymmetric links featuring very different upload and download rates. PEPs can make more efficient use of the network, for example, by merging TCP ACKs (acknowledgements) or compressing data sent at the application layer. [13]

Translation

A translation proxy is a proxy server that is used to localize a website experience for different markets. Traffic from the global audience is routed through the translation proxy to the source website. As visitors browse the proxied site, requests go back to the source site where pages are rendered. The original-language content in the response is replaced by the translated content as it passes back through the proxy. The translations used in a translation proxy can be machine translation, human translation, or a combination of the two. Different translation proxy implementations have different capabilities; some allow further customization of the source site for local audiences, such as excluding the source content or substituting it with original local content.

Accessing services anonymously

An anonymous proxy server (sometimes called a web proxy) generally attempts to anonymize web surfing. Anonymizers may be differentiated into several varieties. The destination server (the server that ultimately satisfies the web request) receives requests from the anonymizing proxy server and thus does not receive information about the end user's address. The requests are not anonymous to the anonymizing proxy server, however, and so a degree of trust is present between the proxy server and the user. Many proxy servers are funded through a continued advertising link to the user.

Access control: Some proxy servers implement a logon requirement. In large organizations, authorized users must log on to gain access to the web. The organization can thereby track usage to individuals. Some anonymizing proxy servers may forward data packets with header lines such as HTTP_VIA, HTTP_X_FORWARDED_FOR, or HTTP_FORWARDED, which may reveal the IP address of the client. Other anonymizing proxy servers, known as elite or high-anonymity proxies, make it appear that the proxy server is the client. A website could still suspect a proxy is being used if the client sends packets that include a cookie from a previous visit that did not use the high-anonymity proxy server. Clearing cookies, and possibly the cache, would solve this problem.

QA geotargeted advertising

Advertisers use proxy servers for validating, checking and quality assurance of geotargeted ads. A geotargeting ad server checks the request source IP address and uses a geo-IP database to determine the geographic source of requests. [14] Using a proxy server that is physically located inside a specific country or a city gives advertisers the ability to test geotargeted ads.

Security

A proxy can keep the internal network structure of a company secret by using network address translation, which can help the security of the internal network. [15] This makes requests from machines and users on the local network anonymous. Proxies can also be combined with firewalls.

An incorrectly configured proxy can provide access to a network otherwise isolated from the Internet. [4]

Cross-domain resources

Proxies allow websites to make web requests to externally hosted resources (e.g., images or music files) when cross-domain restrictions prohibit the website from linking directly to the outside domains. They also allow a browser to make web requests to externally hosted content on behalf of a website when cross-domain restrictions (in place to protect websites from the likes of data theft) prohibit the browser from directly accessing the outside domains.

Malicious usages

Secondary market brokers

Secondary market brokers use web proxy servers to circumvent restrictions on online purchase of limited products such as limited sneakers [16] or tickets.

Implementations of proxies

Web proxy servers

Web proxies forward HTTP requests. The request from the client is the same as a regular HTTP request except the full URL is passed, instead of just the path. [17]

    GET https://en.wikipedia.org/wiki/Proxy_server HTTP/1.1
    Proxy-Authorization: Basic encoded-credentials
    Accept: text/html

This request is sent to the proxy server; the proxy makes the specified request and returns the response.

    HTTP/1.1 200 OK
    Content-Type: text/html; charset=UTF-8
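
A minimal sketch of issuing such a proxied request with Python's standard library follows; the proxy address proxy.example:8080 is a placeholder, and urllib sends the absolute-form request line shown above on the client's behalf.

    import urllib.request

    # Placeholder proxy address; substitute a reachable HTTP proxy.
    handler = urllib.request.ProxyHandler({"http": "http://proxy.example:8080"})
    opener = urllib.request.build_opener(handler)

    # The opener sends "GET http://example.com/ HTTP/1.1" (absolute-form) to the proxy.
    with opener.open("http://example.com/") as resp:
        print(resp.status, resp.headers.get("Content-Type"))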

Some web proxies allow the HTTP CONNECT method to set up forwarding of arbitrary data through the connection; a common policy is to only forward port 443 to allow HTTPS traffic.
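
A minimal sketch of the CONNECT handshake with raw sockets is shown below; the proxy address is a placeholder and error handling is omitted. Once the tunnel is established, the client negotiates TLS end to end with the target, so the proxy only relays opaque bytes.

    import socket
    import ssl

    PROXY = ("proxy.example", 8080)                  # placeholder proxy address
    TARGET_HOST, TARGET_PORT = "en.wikipedia.org", 443

    sock = socket.create_connection(PROXY)
    # Ask the proxy to open a raw TCP tunnel to the target.
    sock.sendall(f"CONNECT {TARGET_HOST}:{TARGET_PORT} HTTP/1.1\r\n"
                 f"Host: {TARGET_HOST}:{TARGET_PORT}\r\n\r\n".encode())
    status_line = sock.recv(4096).split(b"\r\n", 1)[0]
    assert b" 200 " in status_line, "proxy refused the tunnel"

    # From here on the proxy relays bytes; TLS is negotiated with the target itself.
    tls = ssl.create_default_context().wrap_socket(sock, server_hostname=TARGET_HOST)
    tls.sendall(b"GET / HTTP/1.1\r\nHost: en.wikipedia.org\r\nConnection: close\r\n\r\n")
    print(tls.recv(200))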

Examples of web proxy servers include Apache (with mod_proxy or Traffic Server), HAProxy, IIS configured as proxy (e.g., with Application Request Routing), Nginx, Privoxy, Squid, Varnish (reverse proxy only), WinGate, Ziproxy, Tinyproxy, RabbIT and Polipo.

For clients, the problem of complex or multiple proxy servers is solved by a client–server proxy auto-config (PAC) protocol, in which the browser retrieves a PAC file that specifies which proxy to use for each request.

SOCKS proxy

SOCKS also forwards arbitrary data after a connection phase, and is similar to HTTP CONNECT in web proxies.
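
A minimal Python sketch using the third-party PySocks library illustrates the idea; it assumes a SOCKS5 proxy is listening on 127.0.0.1:1080 (for example, a local Tor client or an SSH dynamic port forward).

    import socks  # third-party PySocks package (pip install PySocks)

    # Assumed SOCKS5 proxy on the local machine.
    s = socks.socksocket()
    s.set_proxy(socks.SOCKS5, "127.0.0.1", 1080)

    # After the SOCKS handshake, arbitrary TCP data can be exchanged with the target.
    s.connect(("example.com", 80))
    s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
    print(s.recv(200))
    s.close()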

Transparent proxy

Also known as an intercepting proxy, inline proxy, or forced proxy, a transparent proxy intercepts normal application layer communication without requiring any special client configuration. Clients need not be aware of the existence of the proxy. A transparent proxy is normally located between the client and the Internet, with the proxy performing some of the functions of a gateway or router. [18]

RFC 2616 (Hypertext Transfer Protocol—HTTP/1.1) offers standard definitions:

"A 'transparent proxy' is a proxy that does not modify the request or response beyond what is required for proxy authentication and identification". "A 'non-transparent proxy' is a proxy that modifies the request or response in order to provide some added service to the user agent, such as group annotation services, media type transformation, protocol reduction, or anonymity filtering".

TCP Intercept is a traffic filtering security feature that protects TCP servers from TCP SYN flood attacks, which are a type of denial-of-service attack. TCP Intercept is available for IP traffic only.

In 2009 a security flaw in the way that transparent proxies operate was published by Robert Auger, [19] and the Computer Emergency Response Team issued an advisory listing dozens of affected transparent and intercepting proxy servers. [20]

Purpose

Intercepting proxies are commonly used in businesses to enforce acceptable use policies and to ease administrative overhead, since no client browser configuration is required. This second reason, however, is mitigated by features such as Active Directory group policy, or DHCP and automatic proxy detection.

Intercepting proxies are also commonly used by ISPs in some countries to save upstream bandwidth and improve customer response times by caching. This is more common in countries where bandwidth is more limited (e.g. island nations) or must be paid for.

Issues

The diversion or interception of a TCP connection creates several issues. First, the original destination IP and port must somehow be communicated to the proxy. This is not always possible (e.g., where the gateway and proxy reside on different hosts). There is a class of cross-site attacks that depend on certain behaviors of intercepting proxies that do not check or have access to information about the original (intercepted) destination. This problem may be resolved by using an integrated packet-level and application level appliance or software which is then able to communicate this information between the packet handler and the proxy.

Intercepting also creates problems for HTTP authentication, especially connection-oriented authentication such as NTLM, as the client browser believes it is talking to a server rather than a proxy. This can cause problems where an intercepting proxy requires authentication, and then the user connects to a site that also requires authentication.

Finally, intercepting connections can cause problems for HTTP caches, as some requests and responses become uncacheable by a shared cache.

Implementation methods

In integrated firewall/proxy servers, where the router/firewall is on the same host as the proxy, communicating the original destination information can be done by any internal method; examples of such integrated products include Microsoft TMG and WinGate.

Interception can also be performed using Cisco's WCCP (Web Cache Communication Protocol). This proprietary protocol resides on the router and is configured from the cache, allowing the cache to determine which ports and which traffic are sent to it via transparent redirection from the router. This redirection can occur in one of two ways: GRE tunneling (OSI layer 3) or MAC rewrites (OSI layer 2).

Once traffic reaches the proxy machine itself, interception is commonly performed with NAT (Network Address Translation). Such setups are invisible to the client browser, but leave the proxy visible to the web server and other devices on the internet side of the proxy. Recent Linux and some BSD releases provide TPROXY (transparent proxy) which performs IP-level (OSI Layer 3) transparent interception and spoofing of outbound traffic, hiding the proxy IP address from other network devices.
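
For example, when interception is performed with Netfilter NAT REDIRECT rules on Linux, the proxy can ask the kernel for the intercepted connection's original destination. The sketch below is a minimal Python illustration; SO_ORIGINAL_DST is the option number defined in linux/netfilter_ipv4.h and the code is Linux-specific.

    import socket
    import struct

    SO_ORIGINAL_DST = 80  # option number from linux/netfilter_ipv4.h

    def original_destination(client_sock):
        """Return the (ip, port) the client actually dialed before Netfilter
        redirected the connection to this intercepting proxy (Linux only)."""
        raw = client_sock.getsockopt(socket.SOL_IP, SO_ORIGINAL_DST, 16)
        port, packed_ip = struct.unpack("!2xH4s8x", raw)  # sockaddr_in layout
        return socket.inet_ntoa(packed_ip), port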

Detection

Several methods may be used to detect the presence of an intercepting proxy server:

  • By comparing the client's external IP address to the address seen by an external web server, or sometimes by examining the HTTP headers received by a server. A number of sites have been created to address this issue, by reporting the user's IP address as seen by the site back to the user on a web page. Google also returns the IP address as seen by the page if the user searches for "IP".
  • By comparing the results of online IP checkers when accessed using HTTPS vs. HTTP, as most intercepting proxies do not intercept SSL. If there is suspicion of SSL being intercepted, one can examine the certificate associated with any secure website; the root certificate should indicate whether it was issued for the purpose of intercepting. A minimal sketch of the HTTPS-versus-HTTP comparison is shown after this list.
  • By comparing the sequence of network hops reported by a tool such as traceroute for a proxied protocol such as HTTP (port 80) with that for a non-proxied protocol such as SMTP (port 25). [21]
  • By attempting to make a connection to an IP address at which there is known to be no server. The proxy will accept the connection and then attempt to proxy it on. When the proxy finds no server to accept the connection, it may return an error message or simply close the connection to the client. This difference in behavior is simple to detect. For example, most web browsers will generate a browser-created error page when they cannot connect to an HTTP server but will return a different error when the connection is accepted and then closed. [22]
  • By serving the end-user specially programmed Adobe Flash SWF applications or Sun Java applets that send HTTP calls back to their server.
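
As noted in the list above, a minimal Python sketch of the HTTPS-versus-HTTP comparison follows; icanhazip.com is used only as an example IP-echo service, and any comparable service works.

    import urllib.request

    def reported_ip(scheme):
        """Ask an example IP-echo service which address it sees for this client."""
        with urllib.request.urlopen(f"{scheme}://icanhazip.com", timeout=10) as resp:
            return resp.read().decode().strip()

    http_ip = reported_ip("http")    # may be rewritten or re-sourced by an intercepting proxy
    https_ip = reported_ip("https")  # usually passes through unmodified
    print("interception suspected" if http_ip != https_ip else "addresses match")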

CGI proxy

A CGI web proxy accepts target URLs using a Web form in the user's browser window, processes the request, and returns the results to the user's browser. Consequently, it can be used on a device or network that does not allow "true" proxy settings to be changed. The first recorded CGI proxy, named "rover" at the time but renamed in 1998 to "CGIProxy", [23] was developed by American computer scientist James Marshall in early 1996 for an article in "Unix Review" by Rich Morin. [24]

The majority of CGI proxies are powered by one of CGIProxy (written in the Perl language), Glype (written in the PHP language), or PHProxy (written in the PHP language). As of April 2016, CGIProxy has received about two million downloads, Glype has received almost a million downloads, [25] whilst PHProxy still receives hundreds of downloads per week. [26] Despite waning in popularity [27] due to VPNs and other privacy methods, as of September 2021 there are still a few hundred CGI proxies online. [28]

Some CGI proxies were set up for purposes such as making websites more accessible to disabled people, but have since been shut down due to excessive traffic, usually caused by a third party advertising the service as a means to bypass local filtering. Since many of these users do not care about the collateral damage they are causing, it became necessary for organizations to hide their proxies, disclosing the URLs only to those who take the trouble to contact the organization and demonstrate a genuine need. [29]

Suffix proxy

A suffix proxy allows a user to access web content by appending the name of the proxy server to the URL of the requested content (e.g. "en.wikipedia.org.SuffixProxy.com"). Suffix proxy servers are easier to use than regular proxy servers, but they do not offer high levels of anonymity, and their primary use is for bypassing web filters. However, this is rarely used due to more advanced web filters.

Tor onion proxy software

The Vidalia Tor-network map

Tor is a system intended to provide online anonymity. [30] Tor client software routes Internet traffic through a worldwide volunteer network of servers for concealing a user's computer location or usage from someone conducting network surveillance or traffic analysis. Using Tor makes tracing Internet activity more difficult, [30] and is intended to protect users' personal freedom and their online privacy.

"Onion routing" refers to the layered nature of the encryption service: The original data are encrypted and re-encrypted multiple times, then sent through successive Tor relays, each one of which decrypts a "layer" of encryption before passing the data on to the next relay and ultimately the destination. This reduces the possibility of the original data being unscrambled or understood in transit. [31]

I2P anonymous proxy

The I2P anonymous network ('I2P') is a proxy network aiming at online anonymity. It implements garlic routing, which is an enhancement of Tor's onion routing. I2P is fully distributed and works by encrypting all communications in various layers and relaying them through a network of routers run by volunteers in various locations. By keeping the source of the information hidden, I2P offers censorship resistance. The goals of I2P are to protect users' personal freedom, privacy, and ability to conduct confidential business.

Each user of I2P runs an I2P router on their computer (node). The I2P router takes care of finding other peers and building anonymizing tunnels through them. I2P provides proxies for all protocols (HTTP, IRC, SOCKS, ...).

Comparison to network address translators

The proxy concept refers to a layer 7 application in the OSI reference model. Network address translation (NAT) is similar to a proxy but operates in layer 3.

In the client configuration of layer-3 NAT, configuring the gateway is sufficient. However, for the client configuration of a layer-7 proxy, the destination of the packets that the client generates must always be the proxy server (layer 7); the proxy server then reads each packet and determines the true destination.

Because NAT operates at layer 3, it is less resource-intensive than the layer-7 proxy, but also less flexible. In comparing these two technologies, the term "transparent firewall" may be encountered. A transparent firewall means that the proxy uses the layer-7 proxy advantages without the knowledge of the client. The client presumes that the gateway is a NAT at layer 3 and has no idea about the inside of the packet, yet through this method the layer-3 packets are sent to the layer-7 proxy for investigation.[citation needed]

DNS proxy

A DNS proxy server takes DNS queries from a (usually local) network and forwards them to an Internet Domain Name Server. It may also cache DNS records.
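
A minimal Python sketch of the forwarding step over UDP follows; the upstream resolver 9.9.9.9 and the local port 5353 are arbitrary choices, and caching, TCP fallback, and error handling are omitted. It can be exercised with a stub resolver pointed at the local port, for example dig @127.0.0.1 -p 5353 example.com.

    import socket

    UPSTREAM = ("9.9.9.9", 53)    # arbitrary public resolver
    LISTEN = ("127.0.0.1", 5353)  # unprivileged local port for the example

    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(LISTEN)

    while True:
        query, client = server.recvfrom(512)       # classic UDP DNS message size
        upstream = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        upstream.sendto(query, UPSTREAM)           # forward the query verbatim
        answer, _ = upstream.recvfrom(512)
        upstream.close()
        server.sendto(answer, client)              # relay the answer to the client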

Proxifiers

Some client programs "SOCKS-ify" requests, [32] which allows adaptation of any networked software to connect to external networks via certain types of proxy servers (mostly SOCKS).
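
With the third-party PySocks library, for example, "SOCKS-ifying" a Python program amounts to replacing the standard socket class so that unmodified networking code is routed through the proxy; a SOCKS5 proxy on 127.0.0.1:1080 is assumed here.

    import socket
    import urllib.request

    import socks  # third-party PySocks package

    # Route every socket created from now on through the assumed SOCKS5 proxy.
    socks.set_default_proxy(socks.SOCKS5, "127.0.0.1", 1080)
    socket.socket = socks.socksocket

    # Ordinary networked code now traverses the proxy without modification.
    print(urllib.request.urlopen("http://example.com/").status)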

Residential proxy (RESIP)

A residential proxy is an intermediary that uses a real IP address provided by an Internet service provider (ISP) and attached to physical devices such as the mobile phones and computers of end users. Instead of connecting directly to a server, residential proxy users connect to the target through residential IP addresses, so the target identifies them as ordinary organic internet users and tracking tools are less likely to flag the traffic as proxied. [33] Residential proxies can carry many concurrent requests, and their IP addresses are tied to a specific region. [34] Unlike regular residential proxies, which hide the user's real IP address behind another IP address, rotating residential proxies, also known as backconnect proxies, conceal the user's real IP address behind a pool of proxies. These proxies switch between themselves at every session or at regular intervals. [35]
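
A minimal sketch of the rotation idea using the third-party requests library follows; the proxy URLs are placeholders, and commercial backconnect services usually expose a single gateway endpoint that rotates the exit address on the server side.

    import itertools

    import requests  # third-party

    # Placeholder pool; a residential-proxy provider supplies the real endpoints.
    PROXY_POOL = itertools.cycle([
        "http://user:pass@proxy-a.example:8000",
        "http://user:pass@proxy-b.example:8000",
        "http://user:pass@proxy-c.example:8000",
    ])

    def fetch_with_rotation(url):
        """Send each request through the next proxy in the pool."""
        proxy = next(PROXY_POOL)
        return requests.get(url, proxies={"http": proxy, "https": proxy}, timeout=15)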

Despite providers' assertions that their proxy hosts participate voluntarily, numerous proxies are operated on potentially compromised hosts, including Internet of Things devices. By cross-referencing the hosts, researchers have identified and analyzed logs classified as potentially unwanted programs and exposed a range of unauthorized activities conducted by RESIP hosts, including illegal promotion, fast fluxing, phishing, hosting malware, and more. [36]

References

  1. Luotonen, Ari; Altis, Kevin (April 1994). "World-Wide Web Proxies" (PDF). Archived (PDF) from the original on 9 October 2016.
  2. Shapiro, Marc (May 1986). Structure and Encapsulation in Distributed Systems: the Proxy Principle. 6th International Conference on Distributed Computing Systems. Cambridge, MA, USA. pp. 198–204. inria-00444651. Archived from the original on 26 December 2018. Retrieved 26 December 2018.
  3. "Proxy servers and tunneling". MDN Web Docs. Archived from the original on 26 November 2020. Retrieved 6 December 2020.
  4. Lyon, Gordon (2008). Nmap network scanning. US: Insecure. p. 270. ISBN 978-0-9799587-1-7.
  5. "Forward and Reverse Proxies". httpd mod_proxy. Apache. Archived from the original on 10 February 2011. Retrieved 20 December 2010.
  6. Suchacka, Grażyna; Iwański, Jacek (7 June 2020). "Identifying legitimate Web users and bots with different traffic profiles — an Information Bottleneck approach". Knowledge-Based Systems. 197: 105875. doi: 10.1016/j.knosys.2020.105875 . ISSN   0950-7051. S2CID   216514793.
  7. 1 2 "2010 Circumvention Tool Usage Report" (PDF). The Berkman Center for Internet & Society at Harvard University. October 2010. Archived (PDF) from the original on 18 January 2012. Retrieved 15 September 2011.
  8. "How to Check if Website is Down or Working Worldwide". Hostinger. 19 November 2019. Archived from the original on 14 December 2019. Retrieved 14 December 2019.
  9. "Using a Ninjaproxy to get through a filtered proxy". advanced filtering mechanics. TSNP. Archived from the original on 9 March 2016. Retrieved 17 September 2011.
  10. "Caching Proxy". www.ibm.com. Retrieved 2 July 2023.
  11. Thomas, Keir (2006). Beginning Ubuntu Linux: From Novice to Professional . Apress. ISBN   978-1-59059-627-2. A proxy server helps speed up Internet access by storing frequently accessed pages
  12. I. Cooper; J. Dilley (June 2001). Known HTTP Proxy/Caching Problems. IETF. doi: 10.17487/RFC3143 . RFC 3143 . Retrieved 17 May 2019.
  13. "Layering". Performance Enhancing Proxies Intended to Mitigate Link-Related Degradations. IETF. June 2001. p. 4. sec. 2.1. doi: 10.17487/RFC3135 . RFC 3135 . Retrieved 21 February 2014.
  14. "Hot Tactics For Geo-Targeted Ads on Google & Bing". October 2013. Archived from the original on 14 February 2014. Retrieved 7 February 2014.
  15. "Firewall and Proxy Server HOWTO". tldp.org. Archived from the original on 23 August 2011. Retrieved 4 September 2011. The proxy server is, above all, a security device.
  16. "Sneaker Bot Supreme Proxy". GeoSurf. Archived from the original on 24 September 2017. Retrieved 24 September 2017.
  17. "absolute-form". HTTP/1.1 Message Syntax and Routing. IETF. June 2014. p. 41. sec. 5.3.2. doi: 10.17487/RFC7230 . RFC 7230 . Retrieved 4 November 2017. a client MUST send the target URI in absolute-form as the request-target
  18. "Transparent Proxy Definition". ukproxyserver.org. 1 February 2011. Archived from the original on 1 March 2013. Retrieved 14 February 2013.
  19. "Socket Capable Browser Plugins Result in Transparent Proxy Abuse". The Security Practice. 9 March 2009. Archived from the original on 2 February 2010. Retrieved 14 August 2010.
  20. "Vulnerability Note VU#435052". US CERT. 23 February 2009. Archived from the original on 10 July 2010. Retrieved 14 August 2010.
  21. "Subversion Dev: Transparent Proxy detection (was Re: Introduction_". Tracetop.sourceforge.net. Archived from the original on 16 October 2015. Retrieved 16 November 2014.
  22. Wessels, Duane (2004). Squid The Definitive Guide . O'Reilly. pp.  130. ISBN   978-0-596-00162-9.
  23. Marshall, James. "CGIProxy". Archived from the original on 16 November 2018. Retrieved 12 November 2018.
  24. "The Limits of Control". June 1996. Archived from the original on 6 August 2020. Retrieved 12 November 2018.
  25. "Glype® Proxy Script". glype.com. Archived from the original on 3 January 2013. Retrieved 17 May 2019.
  26. "PHProxy". SourceForge. Archived from the original on 14 March 2016. Retrieved 7 April 2016.
  27. "Google Trends". Google Trends.
  28. "Proxy Stats :: Get Proxi.es". getproxi.es. Archived from the original on 1 September 2021. Retrieved 5 September 2021.
  29. Estrada-Jiménez, José (March 2017). "Online advertising: Analysis of privacy threats and protection approaches". Computer Communications. 100: 32–51. doi:10.1016/j.comcom.2016.12.016. hdl: 2117/99742 . S2CID   34656772.
  30. Glater, Jonathan (25 January 2006). "Privacy for People Who Don't Show Their Navels". The New York Times. Archived from the original on 29 April 2011. Retrieved 4 August 2011.
  31. The Tor Project. "Tor: anonymity online". Archived from the original on 9 April 2010. Retrieved 9 January 2011.
  32. Zwicky, Elizabeth D.; Cooper, Simon; Chapman, D. Brent (2000). Building Internet Firewalls (2nd ed.). O'Reilly. p.  235. ISBN   978-1-56592-871-8.
  33. "What Is a Proxy Server and How Does It Work?". IPRoyal.com. 17 April 2023. Retrieved 2 July 2023.
  34. Smith, Vincent (2019). Go Web Scraping Quick Start Guide: Implement the power of Go to scrape and crawl data from the web. Packt Publishing Ltd. ISBN   978-1-78961-294-3. Archived from the original on 17 January 2023. Retrieved 19 November 2020.
  35. Keenan, James. "What are Residential Proxies?". Smartproxy.com. Archived from the original on 26 December 2021. Retrieved 26 December 2021.
  36. Mi, Xianghang; Feng, Xuan; Liao, Xiaojing; Liu, Baojun; Wang, XiaoFeng; Qian, Feng; Li, Zhou; Alrwais, Sumayah; Sun, Limin; Liu, Ying (May 2019). Resident Evil: Understanding Residential IP Proxy as a Dark Service. 2019 IEEE Symposium on Security and Privacy (SP). pp. 1185–1201. doi: 10.1109/SP.2019.00011 . ISBN   978-1-5386-6660-9. S2CID   132479013.