Audiogrep: Automatic Audio “Supercuts”

Audiogrep is a Python script that transcribes audio files and then creates audio “supercuts” based on search phrases. It uses CMU Pocketsphinx for speech-to-text, and pydub to splice the matching audio segments together.

This is a sister project to my videogrep script, which does a similar thing but with video (and makes use of subtitle tracks rather than speech-to-text).

So far I’ve mostly been experimenting with audio books. Here, for example, are all the phrases in How Google Works by Eric Schmidt and Jonathan Rosenberg that contain the word “data”.

And here are all the references to “private wealth” in Capital in the Twenty-first Century by Thomas Piketty:

You can also extract just individual words, rather than phrases.

For example, here are all instances of “money” and “people” from the book The Automatic Millionaire: A Powerful One-Step Plan to Live and Finish Rich by David Bach:
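Pulling out single words amounts to filtering the timestamped transcription. Assuming the transcriber yields (word, start, end) tuples — a simplified stand-in for Pocketsphinx’s actual hypothesis format — the search reduces to a list comprehension:

```python
# Hypothetical timestamped transcript; real Pocketsphinx output is richer.
transcript = [
    ("the", 0.0, 0.2), ("money", 0.2, 0.6), ("is", 0.6, 0.7),
    ("for", 0.7, 0.9), ("people", 0.9, 1.4), ("now", 1.4, 1.6),
]

def find_words(transcript, targets):
    """Return (start, end) spans for every occurrence of a target word."""
    return [(start, end) for word, start, end in transcript if word in targets]

print(find_words(transcript, {"money", "people"}))  # [(0.2, 0.6), (0.9, 1.4)]
```

The returned spans are what get fed to the splicing step.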

“Control”, “psychological”, “behavior”, and “situations” from the nightmarishly titled Get Anyone to Do Anything: Never Feel Powerless Again — With Psychological Secrets to Control and Influence Every Situation by David J. Lieberman.

And here are “relax” and “large” from Breast Enlargement Hypnosis, a truly remarkable audio experience by Victoria Gallagher.

Another experiment from the same amazing source:

It’s also possible to use the script to create “Frankenstein” sentences. Here’s Bill Clinton telling us to stop voting, sourced from his book My Life:
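Assembling such a sentence is the word search run in reverse: for each word of a target sentence, pick one timestamped occurrence from the source audio and splice them in sentence order. A sketch, again assuming (word, start, end) tuples rather than audiogrep’s actual data structures:

```python
def frankenstein(transcript, sentence):
    """Pick the first timestamped occurrence of each word, in sentence order.

    `transcript` is a list of (word, start, end) tuples; this is an
    illustrative sketch, not audiogrep's actual implementation.
    """
    spans = []
    for target in sentence.lower().split():
        match = next(((s, e) for w, s, e in transcript if w == target), None)
        if match is None:
            raise ValueError(f"word not found in source audio: {target}")
        spans.append(match)
    return spans

transcript = [("stop", 1.0, 1.4), ("voting", 3.2, 3.7), ("please", 5.0, 5.4)]
print(frankenstein(transcript, "please stop voting"))
# [(5.0, 5.4), (1.0, 1.4), (3.2, 3.7)]
```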

And, by integrating moviepy, you can generate video slideshows like these or this:

The code is available on GitHub. Next up I’ll be integrating some of this functionality into videogrep for more refined searches.

Oculus Oedipus

This is part 2 of a series of speculative virtual reality projects. Illustrations by David Tracy. Also appears in The New Inquiry.

Preparation

Figure 1: The family.

 

Figure 2: 3D scanning of the mother.

 

Figure 3: 3D scanning of the father.

 

Figure 4: The mother and father are captured in virtual space.

 

Figure 5: The user enters Oculus Oedipus.

 

Figure 6: “I’m really looking forward to this experience.”

 

Stage 1: Father

Figure 7: A conflict on the road.

 

Figure 8: The user murders the virtual father.

 

Figure 9: The user reflects on his experience thus far.

Stage 2: Sphinx

Figure 10: The riddle of the Sphinx.

 

Figure 11: The user offers a response.

 

Figure 12: The Sphinx is defeated.

Stage 3: Mother

Figure 13: The user seduces the virtual mother.

 

Figure 14: Consummation part one.

 

Figure 15: Consummation continues.

 

Figure 16: A more robust fantasy.

 

Figure 17: Climax.

 

Figure 18: The user is permanently blinded. The experience is concluded.

Postscript

Figure 19: The blinded user wanders the world.

Oculus Birth

This is part 1 of a series of speculative virtual reality projects. Illustrations by David Tracy. Also appears in The New Inquiry.

Figure 1: Conception.

 

Figure 2: Two miniaturized GoPro cameras are affixed to the skull of the fetus.

 

Figure 3: Gestation.

 

Figure 4: “I’m really looking forward to this experience.”

 

Figure 5: The live feed.

 

Figure 6: The mother experiences the birth of her child from the child’s perspective.

 

Figure 7: Crowning.

 

Figure 8: “Happy Birthday.”

 

Figure 9: The experience is concluded.

Case Study

Here is some initial documentation for “Case Study”, a project that I’m working on with Pedro G. C. Oliveira.

“Case Study” is a briefcase that analyzes literary and philosophical texts and produces visualizations of them, using weaponized natural language processing software originally developed for the military. The parsing software finds and tags geopolitical events and world-historical actors in news articles; it was built to help governments make predictions about global trends and material conflicts. In Case Study, we turn that same tool on literary and philosophical texts instead, producing visualizations that frame literary and philosophical events in the language of geopolitics and the military.
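The military parser itself isn’t public, so the following is only a crude stand-in for the idea of tagging “actors” and “events” in a sentence: a naive dictionary lookup over made-up word lists. The real software is far more sophisticated; this just illustrates the shape of the output.

```python
import re

# Made-up actor and event lexicons; the actual parser's categories and
# models are proprietary, so this is purely illustrative.
ACTORS = {"ahab", "whale", "crew"}
EVENT_VERBS = {"attacks", "pursues", "destroys"}

def tag(text):
    """Label each token as ACTOR, EVENT, or O (other)."""
    labels = []
    for token in re.findall(r"\w+", text.lower()):
        if token in ACTORS:
            labels.append((token, "ACTOR"))
        elif token in EVENT_VERBS:
            labels.append((token, "EVENT"))
        else:
            labels.append((token, "O"))
    return labels

print(tag("Ahab pursues the whale"))
# [('ahab', 'ACTOR'), ('pursues', 'EVENT'), ('the', 'O'), ('whale', 'ACTOR')]
```

Run over a whole novel, tags like these are what the visualizations aggregate.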
