DXOMARK on Twitter
“It's #InternationalBatteryDay! 🔋 In a poll this week, we asked if you knew which #smartphone app drains your battery the most. The correct answer: the camera app, at 960 mA; from preview to post-processing, it uses nearly all of the device's major internal components! 📱”
Let’s Architect! Architecting for sustainability | Amazon Web Services
Sustainability is an important topic in the tech industry, as well as in society as a whole. It is defined as the ability to keep performing a process or function over an extended period of time without depleting natural resources or harming the environment. One of the key elements in designing a sustainable workload is software […]
sverweij/dependency-cruiser: Validate and visualize dependencies. Your rules. JavaScript, TypeScript, CoffeeScript. ES6, CommonJS, AMD.
Validate and visualize dependencies. Your rules. JavaScript, TypeScript, CoffeeScript. ES6, CommonJS, AMD.
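The "your rules" part is a declarative set of dependency constraints checked on every run. A minimal sketch of such a rule set is below, assuming the IConfiguration type the package ships; the rule names and paths are illustrative, and in practice the config usually lives in a .dependency-cruiser.js file rather than a TypeScript module.

```typescript
// Illustrative dependency-cruiser rule set (normally kept in .dependency-cruiser.js).
import type { IConfiguration } from "dependency-cruiser";

const config: IConfiguration = {
  forbidden: [
    {
      name: "no-circular",
      severity: "error",
      comment: "Circular dependencies make modules hard to reason about and to reuse.",
      from: {},
      to: { circular: true },
    },
    {
      name: "not-to-test",
      severity: "error",
      comment: "Production code should not import from test code.",
      from: { pathNot: "^test" },
      to: { path: "^test" },
    },
  ],
  options: {
    // Record dependencies on node_modules but do not traverse into them.
    doNotFollow: { path: "node_modules" },
  },
};

export default config;
```

With a config like this in place, a run such as `npx depcruise src` validates the rules, and a graph output type such as `dot` can be piped through GraphViz to visualize the dependency graph (exact flags vary by version).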
How a Hackathon Is Slowly Changing The World | LinkedIn
Behind many corporate sustainability announcements is a grassroots story of a few employees who were convinced of the opportunity to make a positive impact on both the climate and their business. What started out as a hackathon project three years ago to demonstrate the potential of carbon-aware computing […]
Carbon Emissions and Large Neural Network Training
The computation demand for machine learning (ML) has grown rapidly in recent years, and it comes with a number of costs. Estimating the energy cost helps measure the environmental impact and find greener strategies, yet it is challenging without detailed information. We calculate the energy use and carbon footprint of several recent large models (T5, Meena, GShard, Switch Transformer, and GPT-3) and refine earlier estimates for the neural architecture search that found the Evolved Transformer. We highlight the following opportunities to improve energy efficiency and CO2-equivalent emissions (CO2e): Large but sparsely activated DNNs can consume one tenth of the energy of large, dense DNNs without sacrificing accuracy, despite using as many or even more parameters. Geographic location matters for ML workload scheduling, since the fraction of carbon-free energy and the resulting CO2e vary ~5X-10X even within the same country and the same organization; we are now optimizing where and when large models are trained. Specific datacenter infrastructure matters too: cloud datacenters can be ~1.4-2X more energy efficient than typical datacenters, and the ML-oriented accelerators inside them can be ~2-5X more effective than off-the-shelf systems. Remarkably, the choice of DNN, datacenter, and processor can reduce the carbon footprint by up to ~100-1000X.
These large factors also make retroactive estimates of energy cost difficult. To avoid miscalculations, we believe ML papers requiring large computational resources should make energy consumption and CO2e explicit when practical, and we are working to be more transparent about energy use and CO2e in our future research. To help reduce the carbon footprint of ML, we believe energy usage and CO2e should be a key metric in evaluating models, and we are collaborating with MLPerf developers to include energy usage during training and inference in this industry-standard benchmark.
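The factors the abstract lists (sparse models, grid location, datacenter efficiency, accelerator efficiency) all feed one simple calculation: energy is chips × hours × average power × PUE, and CO2e is energy × the grid's carbon intensity. A minimal sketch of that arithmetic, with illustrative placeholder numbers rather than figures from the paper:

```typescript
// Back-of-the-envelope CO2e estimate for a training run, following the factors the
// abstract calls out: accelerator power, datacenter overhead (PUE), grid carbon intensity.

interface TrainingRun {
  accelerators: number;         // number of accelerator chips used
  hoursPerAccelerator: number;  // wall-clock training hours per chip
  avgPowerWatts: number;        // average measured power draw per chip
  pue: number;                  // datacenter Power Usage Effectiveness (>= 1.0)
  gridKgCO2ePerKWh: number;     // carbon intensity of the local electricity mix
}

function estimateCO2eTonnes(run: TrainingRun): number {
  // Energy in kWh: chips x hours x kW per chip, scaled by datacenter overhead.
  const energyKWh =
    run.accelerators * run.hoursPerAccelerator * (run.avgPowerWatts / 1000) * run.pue;
  // Emissions in metric tonnes of CO2 equivalent.
  return (energyKWh * run.gridKgCO2ePerKWh) / 1000;
}

// Hypothetical job: 512 chips for two weeks. Only the grid changes between the two runs,
// which is enough to shift the footprint by roughly an order of magnitude.
const job = { accelerators: 512, hoursPerAccelerator: 24 * 14, avgPowerWatts: 280, pue: 1.1 };
console.log(estimateCO2eTonnes({ ...job, gridKgCO2ePerKWh: 0.6 }).toFixed(1), "t CO2e on a carbon-heavy grid");
console.log(estimateCO2eTonnes({ ...job, gridKgCO2ePerKWh: 0.06 }).toFixed(1), "t CO2e on a low-carbon grid");
```

The point of the example is the one the abstract makes: for a fixed model and hardware, moving the same job to a low-carbon region or a more efficient datacenter changes the result multiplicatively, which is why the choice of DNN, datacenter, and processor can compound to a ~100-1000X difference.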
IT infrastructure hardware and hosting services / hosted solutions
MiNumEco, the interministerial mission for eco-responsible digital technology, led by DINUM and the Ministère de la Transition écologique, to reduce the environmental impacts of digital technology