Tuesday, August 30, 2022

Holy Mackerel! Been totally sucked into A.I. image generation for the last year.

Saturday, June 18, 2022

Been having a crazy good time for the last year working in deep learning with GPUs. All those experiences with Ducer Enterprises back in the late 90s and early aughts have really paid off. Here's a great example: I fed several days' worth of Physical Review B headlines into a neural net and produced interesting images related to condensed matter and materials physics.

Friday, February 16, 2018

A novel method for discovering anti-aging oligopeptides

After a recent presentation on support vector machines at the Rogue Hack Lab by a bright young man named Luke Bryan, I found my mind wandering down interesting new corridors, asking myself how these SVMs could be used in a novel fashion.

It occurred to me that a cascading stack of these could efficiently bifurcate any data with an inherent vector or bias into subgroups/subsequences arranged along a spectrum.

It popped into my mind that one prominent inherent bias in genomic information is the life-span of the species it comes from. It is a simple matter to arrange a collection of peptide-coding genomic data along this spectrum of life-span lengths and train a cascade of SVMs to discern where along the spectrum any block of DNA would fall.
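The cascade step can be sketched in a few lines. Everything below is an illustrative assumption, not the actual pipeline: the k-mer featurization, the synthetic sequences whose GC content stands in for "lifespan bin," and the two-level tree of scikit-learn SVMs are all made up to show the routing idea.

```python
# Toy sketch: a binary tree of SVMs routes a DNA sequence to a lifespan bin.
# Synthetic data only -- GC content rises with the (fake) lifespan bin.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
BASES = "ACGT"
KMERS = [a + b + c for a in BASES for b in BASES for c in BASES]
IDX = {km: i for i, km in enumerate(KMERS)}

def kmer_counts(seq, k=3):
    """Fixed-length featurization: counts of all 3-mers."""
    v = np.zeros(len(KMERS))
    for i in range(len(seq) - k + 1):
        v[IDX[seq[i:i + k]]] += 1
    return v

def fake_seq(bin_id, n=120):
    """Stand-in for real genomic data from lifespan bin `bin_id` (0..3)."""
    p_gc = 0.2 + 0.2 * bin_id
    p = [(1 - p_gc) / 2, p_gc / 2, p_gc / 2, (1 - p_gc) / 2]  # A, C, G, T
    return "".join(rng.choice(list(BASES), size=n, p=p))

# four lifespan bins, shortest- to longest-lived
X, y = [], []
for b in range(4):
    for _ in range(40):
        X.append(kmer_counts(fake_seq(b)))
        y.append(b)
X, y = np.array(X), np.array(y)

# the cascade: root SVM splits bins {0,1} vs {2,3}; each child splits its pair
root = SVC().fit(X, y >= 2)
lo = SVC().fit(X[y < 2], y[y < 2] >= 1)
hi = SVC().fit(X[y >= 2], y[y >= 2] >= 3)

def cascade_bin(seq):
    """Route a sequence down the tree to its predicted spectrum bin."""
    v = kmer_counts(seq).reshape(1, -1)
    if root.predict(v)[0]:
        return 3 if hi.predict(v)[0] else 2
    return 1 if lo.predict(v)[0] else 0
```

With n bins only log2(n) decisions are made per sequence, which is the "efficient spectrum fashion" part of the idea.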

To move this towards actual discovery of novel oligopeptides, one would then just need to train an LSTM recurrent neural net on samples of DNA sequences from along this spectrum in order to generate novel oligopeptides which share the probabilistic character of the original spectrum. The resultant outputs, when passed through the trained SVMs above, would be scored by where they fell on the spectrum. Any peptide sequences which scored past the longest-lived organisms would by definition be from the informational sampling space of an organism with a super-spectral lifespan, even if that hypothetical organism doesn't actually exist.
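The generate-and-score step can be sketched as well. This is a hedged toy: a first-order Markov chain stands in for the LSTM generator, a linear SVM's signed margin stands in for the spectrum score, and all the sequence data is synthetic, with GC content playing the role of the lifespan signal.

```python
# Toy sketch of generate-and-score: candidates whose margin beats every
# real long-lived sample would be "super-spectral" in the post's sense.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(1)
BASES = "ACGT"
PAIRS = [a + b for a in BASES for b in BASES]
PIDX = {p: i for i, p in enumerate(PAIRS)}

def dinuc_counts(seq):
    v = np.zeros(16)
    for i in range(len(seq) - 1):
        v[PIDX[seq[i:i + 2]]] += 1
    return v

def fake_seq(p_gc, n=150):
    p = [(1 - p_gc) / 2, p_gc / 2, p_gc / 2, (1 - p_gc) / 2]
    return "".join(rng.choice(list(BASES), size=n, p=p))

short_lived = [fake_seq(0.3) for _ in range(50)]  # toy short-lifespan bin
long_lived = [fake_seq(0.7) for _ in range(50)]   # toy long-lifespan bin
X = np.array([dinuc_counts(s) for s in short_lived + long_lived])
y = np.array([0] * 50 + [1] * 50)
clf = LinearSVC(C=0.1).fit(X, y)  # signed margin = spectrum score

# Markov-chain stand-in for the LSTM, fit on the long-lived bin
trans = np.ones((4, 4))  # add-one smoothing
b2i = {b: i for i, b in enumerate(BASES)}
for s in long_lived:
    for i in range(len(s) - 1):
        trans[b2i[s[i]], b2i[s[i + 1]]] += 1
trans /= trans.sum(axis=1, keepdims=True)

def sample_seq(n=150):
    out = [rng.integers(4)]
    for _ in range(n - 1):
        out.append(rng.choice(4, p=trans[out[-1]]))
    return "".join(BASES[i] for i in out)

# score candidates against the best real long-lived sample
real_best = clf.decision_function(X[y == 1]).max()
candidates = [sample_seq() for _ in range(200)]
scores = clf.decision_function(np.array([dinuc_counts(s) for s in candidates]))
super_spectral = [s for s, sc in zip(candidates, scores) if sc > real_best]
```

A plain Markov chain mostly reproduces its training distribution, so `super_spectral` stays small here; the bet in the post is that a trained LSTM sampled across the whole spectrum can extrapolate past its endpoints.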

Here's a graphic to help demonstrate the method:
Edit (2/20/2018): Interesting supportive work using a related but different method drawn from GAN construction: Generating and designing DNA with deep generative models

Friday, May 19, 2017

Deep Learning part 1

Back in the ol' days of the late 90s, starting around '96, I started a small software company named Ducer.net with a couple of high school buddies. You can find the last few years of our fading website in the Internet Archive's Wayback Machine starting around 2001, but by then we had drifted far from our original purpose.

My original business plan had been built around neural nets, training them on distributed networked machines kinda like Folding@home or SETI@home. We would hornswoggle our friends and family into running a small screen-saver-type app which would train neural nets and upload the resulting weight sets to a central server, where they would compete in a genetic-algorithm arrangement. It worked great but never earned us a penny or drew in any funding. The AI winter was in full force and no one would talk to an egghead with a big idea.
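The server-side competition can be caricatured in one file. This is a single-process toy, not our actual code: the screensaver clients and upload plumbing are left out, and the "clients" are just rows of a population matrix, each holding the weights of a tiny 2-4-1 net evolved to fit XOR.

```python
# Toy genetic-algorithm competition over neural-net weight sets.
import numpy as np

rng = np.random.default_rng(2)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)  # XOR target

N_W = 8 + 4 + 4 + 1  # weights + biases of a 2-4-1 net, flattened

def forward(w, x):
    """Run the tiny net whose parameters are packed into vector w."""
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    h = np.tanh(x @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)  # higher is better

pop = rng.normal(0, 1, size=(40, N_W))  # 40 uploaded "client" weight sets
for gen in range(300):
    scores = np.array([fitness(w) for w in pop])
    elite = pop[np.argsort(scores)[-10:]]           # server keeps the best
    parents = elite[rng.integers(10, size=(40, 2))]
    cross = rng.random((40, N_W)) < 0.5             # uniform crossover
    pop = np.where(cross, parents[:, 0], parents[:, 1])
    pop += rng.normal(0, 0.1, pop.shape)            # mutation
    pop[:10] = elite                                # elitism

best = max(pop, key=fitness)
```

The distributed version just moves the `fitness` evaluations (and the local training) onto volunteer machines; the server only does the sort, crossover, and mutation.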

Now, with the advent of GPUs and the new TPUs, the sort of deep learning we were just starting to play with when we folded up in 2003 is beyond the "Hot New Thing", so I can't help but appreciate the subtle irony.

Saturday, February 21, 2015

A day at the spa

[ From the Glassman anthologies. These aren't really numbered or in any order so &etc. ]

A day at the Spa

    I swapped my ultrasilk suit for one of the medicinal bathrobes in the dressing room. The escalator swept me downstairs to one of the private vat rooms. I could still afford a little aloneness. Before getting into the gel I set the walls on four sweeping panoramas from around the globe. Gobi desert, Florida Keys underwater, Australian reefs, Oregon rainforest mountain view from 6500 feet. Flicked on a randomize routine to slowly drift the views. So glad they never privatized the global reserves.

    Sometimes I run the process as part of a sensory deprivation trip. Not even being in space can produce a better zero G than the med tanks.

    Climb into the tank slowly, allowing the dendrites to scan my skin first. Feet always need the extra work anyhow. I still walk almost everywhere I go. When the medgel finally reaches my neck, I pause for a second to steel my mental discipline. Many people still get sedated for this part. Gradually dunk my head, swallowing a couple pints first as my mouth goes under. The paroxysm of coughing and choking that comes with inhaling the goo is mostly mental, the techs tell me. Yet I’ve seen many of them go under sedated. I endure it anyways, finally falling back into my natural breathing rhythm as the pseudo-fluid establishes itself in my lungs and extrudes a tiny bit of analgesic to numb my autonomic response to its presence.

    Here’s another reason many people don’t like to remain awake during the process. The faintly pink gel around me begins to grow cloudy. Strange tugging sensations become noticeable in my extremities, the gel grows thicker, almost stiff in places of contact. Nanothreads are penetrating my skin in places, stripping off dead skin cells here and there, tugging other cells into newer, better overlapping regions.

    The threads running into my muscles are checking for strain damage, pumping out any toxic metals I might have picked up, sorting cell arrangements for maximum shear resistance.

    Disturbing gray streamers are emerging from my mouth. All that dust and smoke I picked up while working on the apple trees. I’m glad my hash-patch works well enough I don’t need to smoke anymore. Even with nano, cancer surgery is still expensive.

    My teeth and digestive system tingle and twitch. “Three microcavities sealed,” the readout dumps into one of my headscreens. I won’t bother describing the readout on my other internal organs. All shipshape, except for a tiny bit of liver repair; always gotta watch myself when I go to one of Steve’s parties.

    I only get my bones done once a year. Some things I feel are best left to nature.

    I haven’t requested any installations or modifications this time, so when the medthreads have finished healing what they can and pumping my blood and tissue full of Optimum, they drain the fluid away at last. At the very end a fogger comes on (that's new!) as I puke and cough up the last bits of gel. Almost no sensation with that by the way, medgel is very efficient at diminishing discomfort. The fog lays down a seal coating to cut down on UV for a couple of days, I datagaze.

    Climbing back into my robe, I feel it fluff up and throb with warm, comforting vibes. It is no doubt checking with my implants and the medical database to make sure that everything went fine. Triple, triple, triple redundancy. My refreshed nerves are minutely sensitive to the vibrations of the terry cloth tendrils as they scrape over my skin, looking for any possible stray threads or tissue damage left by the gel.

    I feel a faint affirmation ding! in the back of my mind as I return to my dressing room. My coverall has also freshly cleaned itself and done a long diagnostic. Swap out the robe for the strong sense of purpose running in the old smudged tan jumpsuit. I flick the colorswirl and pick out a late summer scene to match my return to work.

    All is right with the world, or at least my part of it.
   
   

Wednesday, February 11, 2015

Glassman anthology installation #7: DawnGazer on the slope.

I can feel the sun warming my fronds and tendrils. The slope is coming alive around me as an eye opens. A few seconds of bewilderment as a sensory stream aligns with categorizers. Then three more eyes pop open as my mics pick up the first chirping of slope quail and the mournful hoot of doves.

"Time to get up" one my minds thinks as another gazes restlessly at a column of ghosts-in-memory. A few  hundred tiny processes scatter as I put my foot down from the hammock. A gentle breeze sways and pulls a few dozen of my threads up into the slowly warming air.

The body moves easily in the cool shade of morning. Fibrils and tendrils wrap this frail form supplying much of the locomotion but I'm still proud of the pulse that pushes fluid. No need to replace that yet my scanners tell me.

A sparkle of gigs swarm through my neighborhood, bipping and booping into some of the processes I shed earlier. I love how the sand, twigs and pebbles of my vicinity have been arranged to tile Escher and McGregor. So funny what just a few hundred pounds of process can do overnight with a full day's solar siesta.

I think it'll be a good day for flight, thermals are pulling up fast and I've got that extra little spring in my step already. Good ol' adrenal glands once more into the breach.

A flashing red diamond sparkles and screams, hanging from the branch of a scraggly oak. Dang! Storm warnings later. Checking the radar, a mind fragment says I've got hours yet, so I send a jolt up the thread lines, energizing my entire bloom. Minutes later my feet leave the ground.

(Zeep-zeep-zeep-zeep) the call coming in slows my ascent as I check its mood and origin. {IRL://Thunderstump_wanderlamp} Ok, I'll read it: Goingup?Stormwarn?Backdownearly?Short-hop?StayAlert.

Each thought glyph highlights a nicely colored graph expanse of possibilities. Always interesting to see what a complex_transform will make of casual actions. Two of me split off and reshape the vector to more closely match my intention. A spiraling affirmation of delight is returned when they are done.

Saturday, May 19, 2012

A salient analysis, from Reddit of all places

Found this brilliant analysis/refutation on Reddit, of all places, from Thloughts. Nobody tell Noam Chomsky, ok?
This is wrong and long disproven. For even the most rudimentary of perceptual mechanisms, like the first layers in the visual cortex, neurons can act as feature detectors by learning which neurons in lower levels are likely to fire together, or which will not fire if others are firing. Further layers find out when those feature-detector neurons are firing together or not, and so on up. Go a few layers up and now the brain is effectively aggregating all those first neuron responses into much more abstract things, like regions of rapidly varying light intensity or hue (edges of objects in the visual field) or regions of similar or slowly varying color (like surfaces). Later on this information gets glommed together with information from different senses, like our kinesthetic sense, so that at higher and higher levels of abstraction we can do more interesting simulations, like mental rotations of figures.
Every step in that process is "symbology", a little piece of machinery that compactly represents lower level sensory data. People do not have stored up memories of sensory experience at the level of photons impinging on the retina. We are not cameras that can be forced to replay a scene. Rather, the states of feature detector neurons, which can represent the gist of a scene, are remembered, and this information can be used in reverse, with the feature detectors now acting as feature controllers which re-activate lower levels of features, in order to approximately reconstruct scenes. Language is built on thoughts, which are made of concepts, which are abstracted from perception. And every one of those layers is supported by more math and science than found in that book.
There are no "signs" floating around in our minds. To develop a real understanding, you need to know what the neural machinery is doing and how that behavior implements relevant algorithms. And we do know a lot of that. And no, when the results came in, the answer was not that "language is actually the structure that the human mind is made of". Language comes from the human mind; saying, of absolutely any cognitive phenomenon or process that "language did it" can not be an explanation. It's like saying lightning is responsible for electromagnetism.