A salient analysis, from Reddit of all places
Found this brilliant analysis/refutation on Reddit, from Thloughts. Nobody tell Noam Chomsky, ok?
[The idea that language is the structure the mind is made of] is wrong and long disproven. For even the most rudimentary of perceptual
mechanisms, like the first layers in the visual cortex, neurons can act
as feature detectors by learning which neurons in lower levels are
likely to fire together or which will not fire if others are firing.
Further layers detect when those feature-detector neurons are firing
together or not, and so on up. Go a few layers up and now the brain is
effectively aggregating all those first neuron responses into much more
abstract things, like regions of rapidly varying light intensity or hue
(edges of objects in the visual field) or regions of similar or slowly
varying color (like surfaces). Later on this information gets glommed
together with information from different senses, like our kinesthetic
sense, so that at higher and higher levels of abstraction we can do more
interesting simulations, like mental rotations of figures.
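The feature-detector idea in that paragraph can be loosely illustrated with a toy convolution. This is not a model of cortex; the kernel and test image are made up for the example, and the analogy is only that each output cell "fires" where a low-level pattern (here, a horizontal edge) is present:

```python
import numpy as np

# Toy "feature detector": a 3x3 horizontal-edge kernel, loosely analogous
# to an early visual neuron that responds where light intensity changes
# rapidly from one row to the next.
EDGE_KERNEL = np.array([[-1, -1, -1],
                        [ 0,  0,  0],
                        [ 1,  1,  1]])

def detect_edges(image):
    """Valid-mode 2D convolution: each output cell is the kernel's
    response at that position; large magnitude means "edge here"."""
    h, w = image.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(image[i:i + 3, j:j + 3] * EDGE_KERNEL)
    return out

# A dark region above a bright region: one horizontal edge.
img = np.zeros((6, 6))
img[3:, :] = 1.0
response = detect_edges(img)
```

Here `response` is nonzero only in the rows straddling the dark-to-bright boundary. Stacking such detectors, with each layer consuming the previous layer's responses, is the kind of aggregation the passage describes.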
Every step in that process is "symbology", a little piece of
machinery that compactly represents lower level sensory data. People do not
have stored up memories of sensory experience at the level of photons
impinging on the retina. We are not cameras that can be forced to replay
a scene. Rather, the states of feature detector neurons, which can
represent the gist of a scene, are remembered, and this information can
be used in reverse, with the feature detectors now acting as feature
controllers which re-activate lower levels of features, in order to
approximately reconstruct scenes. Language is built on thoughts, which
are made of concepts, which are abstracted from perception. And every
one of those layers is supported by more math and science than found in
There are no "signs" floating around in our minds. To develop a real
understanding, you need to know what the neural machinery is doing and
how that behavior implements relevant algorithms. And we do know a lot
of that. And no, when the results came in, the answer was not that
"language is actually the structure that the human mind is made of".
Language comes from the human mind; saying, of absolutely any
cognitive phenomenon or process, that "language did it" cannot be an
explanation. It's like saying lightning is responsible for
A surfeit of labor
A computer used to be a person who sat at a desk with slide rules, trigonometry charts, maybe an abacus, and performed calculations.
I get to wondering about this definition and what it means in terms of man-hours (pardon the sexism) spent doing work. Let's do a simple thought experiment.
Let's say that you have a well-trained computer and she can perform any mathematical function, including something complex like an algebraic equation with a trigonometric function, in 1 second. That's 3,600 calculations per hour. Times a 40-hour work week, 144,000 calculations per week. Times 50 weeks, 7.2 million calculations a year. Times an optimistic 80-year working life, 576 million calculations in a human computer's lifetime.
Without pursuing the whole geometric progression of accelerating innovation, just thinking about the number of CPUs being made in the years 2011-2012, how many millions or billions of human lifetimes of labor are present in the earth's current stockpile of processors by this standard?
According to Wikipedia
as of 2010, the fastest six-core PC processor reached 109 GFLOPS. That's 18 GFLOPS or so per core. Let's stretch the above figure and call it 600 million calculations in a human lifetime. Equating one flop to a single human calculation, that is 30 human lifetimes' worth of calculating every second from a single core.
What is the financial value of this calculating resource at minimum wage?
Update [interrupted by work in the greenhouse]:
I don't mean for just 1 processor but, as I mentioned above, for all the earth's processors. Unlike humans, processors can run regardless of sleep cycles, though most of them are shut off by their human users, so let's stick with 7.2 million seconds per year as a duty cycle. 7.2 million seconds times 30 lifetimes per second makes about 216 million human life-labors per processor per year. I'm still trying to run down the figures on annual CPU production.
Update 2: Ok, well, this will do. A ballpark figure from this ZDNet article
places last year's production of PCs at around 353 million, possibly climbing to 500 million in 2012. So obviously there are way more CPUs than that, but let's take it easy and round it down to 250 million for each of the last 2 years. That brings us to 500 million processors arguably extant and running on the planet. 216 million times 500 million makes roughly a hundred quadrillion human lifetimes' worth of calculating available for our use on the planet each year.
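As a sanity check, the whole chain can be recomputed in a few lines from the stated assumptions (1 calculation per second, 40-hour weeks, 50 weeks a year, an 80-year working life, 18 GFLOPS per core, one flop equated to one human calculation, 500 million processors):

```python
# Back-of-envelope check of the thought experiment's arithmetic.
CALCS_PER_SECOND = 1
SECONDS_PER_YEAR = 40 * 50 * 3600            # 7.2 million working seconds
lifetime_calcs = CALCS_PER_SECOND * SECONDS_PER_YEAR * 80   # 576 million

FLOPS_PER_CORE = 18e9                        # ~109 GFLOPS / 6 cores
# Unrounded: 31.25 lifetimes/second; rounding the lifetime up to
# 600 million gives an even 30 lifetimes per second per core.
lifetimes_per_second = FLOPS_PER_CORE / lifetime_calcs

# Keep the human 7.2-million-second duty cycle per processor per year.
lifetimes_per_processor_year = lifetimes_per_second * SECONDS_PER_YEAR

PROCESSORS = 500e6
planetary_lifetimes_per_year = lifetimes_per_processor_year * PROCESSORS

print(f"{lifetime_calcs:,} calcs per human lifetime")
print(f"{planetary_lifetimes_per_year:.3e} human life-labors per year")
```

With these inputs the chain comes out to 576 million calculations per lifetime, about 225 million life-labors per processor per year, and roughly 1.1 × 10^17 per year planet-wide.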
Hopefully I've fudged enough on the downside to cover any errors in my figuring.
Labels: computation, computer(human), CPU, labor cost, processing power
The promise of broadband haptics and sensory mapping.
While Nokia focuses on the simple use of this technique as a substitute for ring tones,
it occurs to me that, with a little tweaking, quite a bit more information could be routed to our attention through the wasted biological bandwidth of the skin's nerves. With some practice I think you could "listen" through such a patch: just start out simultaneously feeding the audio stream to skin and ear, then gradually lower the ear amplitude as the skin nerves adapt to carry the audio path.
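The training schedule above can be sketched in a few lines. This is purely hypothetical; the session count and the linear fade are made-up parameters, not anything from Nokia's work:

```python
def session_gains(n_sessions):
    """Return (ear_gain, skin_gain) pairs for n_sessions >= 2 training
    sessions: the skin channel stays at full amplitude while the ear
    channel fades linearly from 1.0 down to silence."""
    gains = []
    for s in range(n_sessions):
        ear = 1.0 - s / (n_sessions - 1)   # 1.0 on session 0, 0.0 on the last
        gains.append((ear, 1.0))
    return gains
```

For example, `session_gains(5)` steps the ear channel through 1.0, 0.75, 0.5, 0.25, 0.0 while the skin patch carries the full signal throughout.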
More dreams of near term nano
Nowadays when you go in for a CT scan of your digestive organs, you get a sort of milkshake-like drink to swallow, a contrast agent. Or if you are unlucky enough to need a scope, you'll have assorted piping slithered in one opening or the other. Your best hope is to get one of those pill cameras.
I'm envisioning a different approach based not on x-rays and radiation-blocking contrast agents or intrusive devices, but on food science and fiber optics: a swallowable, transparent, gel- or syrup-like solution that would make an excellent optical waveguide. Even a thin film of it persisting through each peristaltic wave would act as a waveguide. Low-power laser emitters in the straw would supply pulses of light to each swallow; the light would propagate through the liquid and backscatter during the dark phases between pulses, carrying information from the gut. Very sensitive cameras outside the patient (in a darkened room) could also capture light leaking out of the body. Pursued over a course of hours, the entire digestive system could be mapped.
Neal Stephenson's book "The Diamond Age" makes reference to the use of lidar for nanoscopic ranging and communication between nanorobots. This article
makes nice inroads in that direction.
Love these little baby steps in nanolithography. What are we down to now in common practice? About 25 nanometers for some flash memory and processors? I wonder what sub-wavelength techniques will add to that. The critical line? "Realizable using current technology." That's always sweet.
This is one of the best so far: very precise control and high yields for an organic reaction. Also neat how the process restricts the by-product yields.