
Is fog computing the future of databases?

Guillaume Pierre
professor of computer science at the University of Rennes
Key takeaways
  • Fog computing is a decentralised computing infrastructure, where multiple small machines are geographically dispersed and placed close to the users.
  • The technology is in full development and helps to reduce the travel time of data streams to cloud data centres.
  • Fog computing is already used in industry, and is of interest to many fields such as agriculture, health and tourism.
  • It allows for greater speed and efficiency, which is very useful for applications that require interactivity, such as augmented reality or video games.
  • The system is not intended to replace the cloud, but can address some of its limitations: high-energy consumption, saturation, latency, etc.

Sending an e-mail, watching a video on YouTube, organising a video conference meeting, or playing an online game… Each of these activities requires significant data flows to and from servers located in data centres. The cloud is the preferred remote processing and storage system for developing all the Internet applications we use every day. But other, decentralised computing infrastructures exist, and fog computing is growing in popularity. According to specialist consultant Future Market Insights, the global fog computing market is expected to reach $2.2 billion by 2032, up from $196.6 million in 2022.

Shortening the data journey

What is fog computing? “It is a highly decentralised cloud, with small, geographically dispersed computing units, closer to the data sources, and therefore closer to the users,” says Guillaume Pierre, a professor of computer science at the University of Rennes who is currently working on this technology. Users’ data travels back and forth to data centres that are generally very far away, possibly in another country or on another continent, and that consume a lot of energy. Fog computing makes it possible to shorten the routes of these flows. Guillaume Pierre works with the Raspberry Pi, a credit-card-sized machine often used to teach computing.

This infrastructure therefore addresses certain limitations of the cloud. However, fog computing will not replace the cloud, warns the computer science professor: “Fog computing is rather the extension of the cloud into new territories, new types of needs.” Its main appeal is the speed and efficiency of data transmission. Fog computing can therefore be particularly useful when an application requires the lowest possible response time, as with augmented reality or video games.

Fog computing is the extension of the cloud into new territories, new types of needs.

“When we move towards demanding usage scenarios, the response time can be significant enough that interactivity is compromised and the application functions poorly or not at all,” says Guillaume Pierre. For virtual or augmented reality, for example, experts say that if the time between a movement and the display update exceeds 20 milliseconds, the user may suffer from motion sickness because objects appear unstable. “Fog computing can be a solution to reduce the latency between the user and the game,” says the professor.
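The 20-millisecond constraint can be made concrete with a small latency-budget calculation. The budget comes from the article; the network, processing and rendering figures below are hypothetical examples, not measurements:

```python
# Illustrative motion-to-photon latency budget for an AR application.
# The 20 ms budget is from the article; all other numbers are assumed.

MOTION_TO_PHOTON_BUDGET_MS = 20.0

def total_latency_ms(network_rtt_ms: float, processing_ms: float, render_ms: float) -> float:
    """Sum the main contributors to the delay between movement and display update."""
    return network_rtt_ms + processing_ms + render_ms

def within_budget(total_ms: float) -> bool:
    return total_ms <= MOTION_TO_PHOTON_BUDGET_MS

# A distant cloud data centre: ~40 ms of network round trip alone blows the budget.
cloud = total_latency_ms(network_rtt_ms=40.0, processing_ms=5.0, render_ms=8.0)
# A nearby fog node: a few milliseconds of network round trip.
fog = total_latency_ms(network_rtt_ms=4.0, processing_ms=5.0, render_ms=8.0)

print(within_budget(cloud))  # False
print(within_budget(fog))    # True
```

The point of the sketch is that processing and rendering costs are the same in both cases; only the network term changes, and with a far-away data centre that term alone can exceed the whole budget.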

An advantage for the Internet of Everything 

The other advantage of fog computing lies in the development of the Internet of Things. Connected objects such as smartphones, tablets, cars and smart TVs are now ubiquitous, as they are in many fields such as industry, agriculture, scientific research, urban planning and security. Connected objects such as a temperature sensor, a camera or an energy meter produce data at specific points. “When we work with scientists who observe the flooding of a river, for example, it can be interesting to process the data on the spot so as to be able to programme reactions such as changing the frequency of measurements, applying a particular type of treatment, etc.,” explains Guillaume Pierre.
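The river-flooding example can be sketched as a local decision rule running on the fog node itself, with no cloud round trip. The threshold and sampling intervals below are assumptions chosen for illustration, not values from the project:

```python
# Hypothetical on-the-spot processing at a fog node monitoring a river:
# the node adapts its own measurement frequency when readings cross a
# threshold. All numeric values are illustrative assumptions.

FLOOD_THRESHOLD_M = 2.5    # assumed alert water level, in metres
NORMAL_INTERVAL_S = 600    # one reading every 10 minutes in normal conditions
ALERT_INTERVAL_S = 30      # one reading every 30 seconds when flooding threatens

def next_interval(water_level_m: float) -> int:
    """Choose the next sampling interval locally, without contacting the cloud."""
    if water_level_m >= FLOOD_THRESHOLD_M:
        return ALERT_INTERVAL_S
    return NORMAL_INTERVAL_S

print(next_interval(1.2))  # 600: calm river, infrequent measurements
print(next_interval(3.1))  # 30: rising water, measure much more often
```

Because the rule runs where the sensor is, the reaction (here, denser sampling) happens immediately instead of waiting for data to travel to a distant data centre and back.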

Is fog computing a solution to the environmental challenges posed by data centres? According to Guillaume Pierre, the answer is unclear. These centres consume a lot of energy, but they are also well optimised, whereas fog technology is still being developed. “If we do things badly, it is possible to consume more. On the other hand, we may have access to more renewable energy to power the small machines. We are thinking about ways to power these tools with renewable sources, perhaps solar panels, which would reduce the ecological impact considerably,” says the specialist.

Our massive use of digital technology could also lead to a storage crisis. According to researchers at Aston University in England, the cloud will reach saturation point, with a 300% increase in the amount of data in the world over the next three years (study published in December 2022). “Organising storage systems based on smaller units may be part of the solution, especially if the data is already dispersed at the outset, as with the Internet of Things,” says Guillaume Pierre.
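Organising storage across many small units requires a rule for deciding which unit holds which data. A minimal sketch, assuming a simple hash-based placement scheme (the node names and the scheme itself are illustrative, not a real fog platform's API):

```python
# A minimal sketch of dispersing storage across many small units,
# using hash-based placement. Node names are hypothetical examples.
import hashlib

FOG_NODES = ["rennes-01", "rennes-02", "nantes-01", "brest-01"]

def place(key: str, nodes: list[str]) -> str:
    """Deterministically map a data item to the node that stores it."""
    digest = int(hashlib.sha256(key.encode()).hexdigest(), 16)
    return nodes[digest % len(nodes)]

# Data produced near a sensor can stay on a nearby node rather than
# travelling to a single, distant data centre.
print(place("sensor-42/2022-12-01", FOG_NODES))
```

Every client computes the same placement from the key alone, so no central directory is needed; real systems typically refine this idea (e.g. with consistent hashing) so that adding or removing a node moves as little data as possible.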

Towards the massification of the technology? 

Beyond the comparison with the cloud, research into fog computing has already led to concrete advances. Guillaume Pierre coordinated the European FogGuru doctoral training project, in which eight doctoral students worked with the city of Valencia, in Spain, on water consumption, a major issue in this semi-arid area. The city has been deploying smart meters, similar to Enedis’ Linky device, for the past 15 years; the team of researchers developed the application that processes their data in order to intervene more quickly when a water leak affects consumers. Previously, the response time was between three and six days. With fog computing, data is transmitted more frequently and efficiently, and the response time is reduced to only a few hours, thus avoiding the waste of this precious resource.
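A classic rule of the kind a leak-detection application can apply close to the meters is to flag households where water never stops flowing overnight. This is a hypothetical sketch, not the FogGuru application itself; the threshold and reading format are assumptions:

```python
# Hypothetical leak-detection rule run near the smart meters:
# sustained night-time flow suggests a leak, since consumption
# normally drops to zero at some point overnight.
# Threshold and data format are illustrative assumptions.

NIGHT_FLOW_THRESHOLD_LPH = 5.0  # litres/hour considered suspicious at night

def leak_suspected(night_readings_lph: list[float]) -> bool:
    """Flag a meter whose every overnight reading shows continuous flow."""
    return bool(night_readings_lph) and min(night_readings_lph) > NIGHT_FLOW_THRESHOLD_LPH

print(leak_suspected([7.2, 6.8, 9.1]))  # True: water never stops flowing
print(leak_suspected([0.0, 3.4, 6.0]))  # False: flow drops to zero at times
```

Running such a check on a nearby fog node means each new batch of readings can be evaluated as it arrives, which is consistent with the drop in response time from days to hours described above.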

Fog computing is also already being deployed in industry. Telephone operators are very interested in developing the technology, as are museums and tourist offices, which see it as a way of offering smooth, fast animations and simulations. The uses are varied: scientific data processing, video games, industrial processes, restaurants that want to optimise their foot traffic, medical analyses, augmented-reality visits to a city or an exhibition, etc.

So what is missing for the large-scale development of this technology? “What we don’t have today are general systems, where anyone can deploy their application in the fog and where several uses can coexist. This massification will take a little time, because we lack the application deployment and platform management technologies, which we are currently working on. Within ten years or so, we will have this technology deployed and available, perhaps not for individuals, but certainly for companies,” says Guillaume Pierre.

Sirine Azouaoui 
