Post45 Data Collective — The Post45 Data Collective peer reviews and houses literary and cultural data from 1945 to the present on an open-access website designed, hosted, and maintained by Emory University’s Center for Digital Scholarship.
Variation kicks in in the later years, once you consider multiple marriages, divorce, separation, and opposite-sex versus same-sex relationships. This chart breaks it all down.
Behind the Scenes of Spotify’s New AI DJ — Spotify
Since we launched Spotify’s brand-new AI DJ in beta a few weeks back, Premium listeners across the U.S. and Canada have had the chance to experience our personalization capabilities in a whole new way. We’ve already seen so much love for DJ, both on-platform and across social media, and we’re not even out to 100% of users yet.... Read more »
Create stunning heatmaps, data visualizations and interactive maps with iipmaps.com - the powerful mapping platform for data storytelling and geographic insights.
From Space to Story in Data Journalism, Nightingale
“Two weeks ago today, a satellite whirled above Washington on its way around the earth and shot photographs from 400 miles up that could change the way...
We explore large-scale training of generative models on video data. Specifically, we train text-conditional diffusion models jointly on videos and images of variable durations, resolutions and aspect ratios. We leverage a transformer architecture that operates on spacetime patches of video and image latent codes. Our largest model, Sora, is capable of generating a minute of high fidelity video. Our results suggest that scaling video generation models is a promising path towards building general purpose simulators of the physical world.
I was relaxing on a beach during my summer leave when I received an email from a reader asking whether it is technically possible to write a virus in Python. The short answer: YES. The longer answer: yes, BUT… Let’s start by saying that viruses are a little anachronistic in 2021… nowadays other kinds of malware (worms, for example) are far more common than viruses. Moreover, modern operating systems are more secure and less prone to infection than MS-DOS or Windows 95 were (sorry Microsoft…), and people are more aware of the risks of malware...
If you’re a fan of neural networks, you’ve probably heard of the ELU, GELU, and SiLU activation functions. These activation functions are still not very common, so in this post we are going to get to know them a little better.

elu: The Exponential Linear Unit is a smooth approximation to the rectifier function. Its main advantage is that it generates negative outputs, which help guide the network’s weights and biases in the desired directions; its main drawback is that it increases computation time.

import numpy as np
import seaborn as sns
import matplotlib.pyplot as plt

def elu(x, alpha):
    return [z if z > 0 else alpha * (np.exp(z) - 1) for z in x]

x = np.random.randint(-10, 10, 100)
alpha = .5
y = elu(x, alpha)

and plot to check graphically:

plt.style.use('ggplot')
g = sns.lineplot(x=x, y=y)
g.axhline(0, ls='--', color='gray')
g.axvline(0, ls='--', color='gray')
g.set(ylim=(-1.5, 2))
plt.legend(labels=['elu'])
plt.title('Exponential Linear Unit (ELU)')
plt.xlabel('x')
plt.ylabel('y')
plt.show()

The plot: [ELU plot]

gelu: The Gaussian Error Linear Unit is used in language models and transformer models like GPT-2 and BERT. This function mitigates the vanishing-gradient problem and, unlike ELU, it has a continuous derivative at 0, which can sometimes make training faster.
import math

def gelu(x):
    return [0.5 * z * (1 + math.tanh(math.sqrt(2 / np.pi) * (z + 0.044715 * math.pow(z, 3)))) for z in x]

y = gelu(x)
plt.style.use('ggplot')
g = sns.lineplot(x=x, y=y)
g.axhline(0, ls='--', color='gray')
g.axvline(0, ls='--', color='gray')
g.set(xlim=(-4, 4))
g.set(ylim=(-0.5, 2))
plt.xlabel('x')
plt.ylabel('y')
plt.legend(labels=['gelu'])
plt.title('Gaussian Error Linear Unit (GELU)')
plt.show()

The output: [GELU plot]

silu: The Sigmoid Linear Unit serves as a smooth approximation to the ReLU.

def sigmoid(x_elem):
    return 1 / (1 + np.exp(-x_elem))

def silu(x, theta=1.0):
    return [x_elem * sigmoid(theta * x_elem) for x_elem in x]

y = silu(x)
plt.style.use('ggplot')
g = sns.lineplot(x=x, y=y)
g.axhline(0, ls='--',... Continue Reading
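As a quick sanity check on the three activations discussed in the excerpt, the list-based helpers can be evaluated at a few fixed points. This is a minimal sketch: the function bodies mirror the excerpt’s formulas (the tanh approximation for GELU, `theta` defaulting to 1.0 for SiLU), and the sample points are my own choice.

```python
import math
import numpy as np

def elu(x, alpha):
    # ELU: identity for positive inputs, alpha * (exp(z) - 1) otherwise
    return [z if z > 0 else alpha * (np.exp(z) - 1) for z in x]

def gelu(x):
    # tanh approximation of GELU, as used in GPT-2 and BERT
    return [0.5 * z * (1 + math.tanh(math.sqrt(2 / np.pi) * (z + 0.044715 * z ** 3))) for z in x]

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def silu(x, theta=1.0):
    # SiLU (a.k.a. swish): z * sigmoid(theta * z)
    return [z * sigmoid(theta * z) for z in x]

pts = [-2.0, 0.0, 2.0]
print([round(v, 4) for v in elu(pts, 0.5)])
print([round(v, 4) for v in gelu(pts)])
print([round(v, 4) for v in silu(pts)])
```

All three agree with the identity for large positive inputs, are exactly zero at zero, and differ mainly in how much negative signal they pass through for negative inputs.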