Geo-replication systems are designed to provide improved availability and disaster tolerance by using geographically distributed data centers.[1] They can also improve response times for applications such as web portals. Geo-replication can be achieved using software, hardware or a combination of the two.
Geo-replication software is a network performance-enhancing technology designed to provide improved access to portal or intranet content for users at the most remote parts of large organizations. It is based on the principle of storing complete replicas of portal content on local servers, and then keeping the content on those servers up to date using heavily compressed data updates.
Geo-replication technologies are used to replicate the content and data of portals, intranets and web applications between servers across wide area networks (WANs), allowing users at remote sites to access central content at LAN speeds.
Geo-replication software can improve application performance over data networks that suffer from limited bandwidth, high latency and periodic disconnection. Terabytes of data can be replicated over a wide area network, giving remote sites faster access to web applications.
Geo-replication software uses a combination of data compression and content caching technologies. Differencing technologies can also be employed to reduce the volume of data that has to be transmitted to keep portal content consistent across all servers. This update compression can reduce the load that portal traffic places on networks and improve the response time of a portal.
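The following sketch illustrates the general idea of a compressed differential update; the sample content, patch format and function names are invented for the example and do not describe any particular product's protocol. Because the remote server already holds the previous copy of a page, only a compressed description of what changed needs to cross the WAN.

```python
import zlib

# Illustrative sketch only: a remote replica already holds yesterday's copy of
# a portal page, so only a compressed patch has to be transmitted.
body = b"<p>Static policy document that rarely changes.</p>" * 500
old_page = b"<h1>Portal news: 3 March</h1>" + body   # copy already on the remote server
new_page = b"<h1>Portal news: 4 March</h1>" + body   # updated copy on the central server

def single_span_delta(old: bytes, new: bytes):
    """Crude differencing: strip the common prefix and suffix and return
    (offset, bytes_removed, replacement_bytes)."""
    start, limit = 0, min(len(old), len(new))
    while start < limit and old[start] == new[start]:
        start += 1
    end_old, end_new = len(old), len(new)
    while end_old > start and end_new > start and old[end_old - 1] == new[end_new - 1]:
        end_old -= 1
        end_new -= 1
    return start, end_old - start, new[start:end_new]

offset, removed, replacement = single_span_delta(old_page, new_page)
delta_update = zlib.compress(
    offset.to_bytes(4, "big") + removed.to_bytes(4, "big") + replacement, 9)

print(f"full page: {len(new_page)} bytes, "
      f"compressed full copy: {len(zlib.compress(new_page, 9))} bytes, "
      f"compressed delta: {len(delta_update)} bytes")
```

In this toy case only a few dozen bytes need to be sent to bring the remote replica up to date, rather than the full page.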
Remote users of web portals and collaboration environments frequently experience network bandwidth and latency problems that slow down opening and closing files and other interactions with the portal. Geo-replication technology is deployed to bring remote users' portal performance in line with that experienced by users accessing the portal locally in the central office.
To deliver this reduction in the size of the data updates required across a portal deployment, geo-replication systems often use differencing engines. These engines compare the content held on each portal server down to the byte level. Because the system knows what content is already on each server, a change made on one server can be rebuilt on each of the other servers in the deployment largely from content those servers already host. This type of differencing ensures that no content, at the byte level, is ever sent to a server twice.
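A minimal sketch of the block-level idea, assuming an rsync-style scheme: the receiving server hashes the blocks it already holds, and the sender describes the new content as references to those blocks plus literal bytes for genuinely new data. The block size, function names and sample data are hypothetical; production engines typically use rolling hashes so that matching blocks are found at any byte offset.

```python
import hashlib

BLOCK = 1024  # block size in bytes (arbitrary for this sketch)

def block_signatures(data: bytes):
    """Hash every fixed-size block the receiving server already holds."""
    return {hashlib.sha1(data[i:i + BLOCK]).hexdigest(): i
            for i in range(0, len(data), BLOCK)}

def build_update(new: bytes, remote_sigs: dict):
    """Describe `new` as references to blocks the remote already has, plus
    literal bytes for anything genuinely new."""
    instructions = []
    for i in range(0, len(new), BLOCK):
        block = new[i:i + BLOCK]
        digest = hashlib.sha1(block).hexdigest()
        if digest in remote_sigs:
            instructions.append(("copy", remote_sigs[digest], len(block)))
        else:
            instructions.append(("literal", block))
    return instructions

def apply_update(old: bytes, instructions):
    """Rebuild the new content on the remote server from its own old copy."""
    out = bytearray()
    for op in instructions:
        if op[0] == "copy":
            _, offset, length = op
            out += old[offset:offset + length]
        else:
            out += op[1]
    return bytes(out)

old = b"A" * 4096 + b"B" * 4096               # content already on the remote server
new = b"A" * 4096 + b"C" * 512 + b"B" * 4096  # one new section inserted centrally

update = build_update(new, block_signatures(old))
assert apply_update(old, update) == new
literal_bytes = sum(len(op[1]) for op in update if op[0] == "literal")
print(f"{len(new)} bytes rebuilt by sending only {literal_bytes} literal bytes")
```

Blocks that already exist on the remote server are referenced by hash rather than retransmitted, so unchanged content never crosses the WAN a second time.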
Geo-replication systems are often extended to deliver local replication beyond the server and down to the laptop of a single user. Server-to-laptop replication enables mobile users to access a local replica of their business portal on a standard laptop. This technology may be employed to provide in-the-field access to portal content by, for example, sales forces and combat forces.