I'd say a lot of energy is going into socializing geeks, and let's not pretend that's a new thing. Engineers tend to shoulder awesome responsibilities, and our justified fears of bad engineering encourage society to put them through rigorous training in people skills, starting from the handicap that they don't have any (that's the stereotype, to begin with).
In actuality, engineers end up with lots of people skills, rivaling lawyers', which accounts for some of the contention in the area of licenses. How do we collaborate on the basis of a clear understanding of the intellectual property rules? The NSA was at OSCON this year to tell us about it, and specifically to solicit the input of the "big companies" whose province our little Portland hub has become. This talk, which I attended, was more reflective of Washington, DC's ways of thinking than anything else. We at OSCON maybe didn't appreciate the growing pains when it comes to USG mandates (the plan to run on 20% open source code is commendable, if maybe an awkward codification of the principle).
More worrisome than the social skills of GRUNCH workers are the hard-coded biases that may sneak into conditioned reflexes we have little control over, the Terminator Scenario always looming. Let's step back from nuclear war (the debate team favorite) and just talk about simple face recognition, and whether it accepts you as human. On Planet of the Apes, our AI bots may have a hard time treating us as human, Global U students though we be.
IBM is encouraging us to focus on natural disasters: if not their prevention, then the management of their consequences. China has whole cities waiting for transplants, were the infrastructure in place for population upheavals. But it isn't. Rubber boats. Uncooled trucks. Refugees are not accustomed to much appropriate technology in their camps. The technology is so far ahead of its deployment potential that it languishes for lack of use. Prophecies are self-fulfilling.
Translating human language using state-of-the-art deep (or not so deep) neural nets is helping scrub some forms of bias from the stack, versus hard-wiring it in. That's the good news. As humans get better at thinking "cross platform", they'll block less on differences that don't make a difference. That's a kind of "freezing" in thinking we want to root out. The White Witch of the Narnia books was just such an inducer of semi-paralysis: you'd stop improving your thinking and get lazy, relying on what had always worked in the past. Learning is lifelong now.
The icosahedron is big this year. I see it in ONNX, the Cognitive Toolkit, and I believe Hyperledger, a reference blockchain. That's good news for Synergetics (an outgrowth of New England Transcendentalism), wherein the icosahedron symbolizes a kind of Apollonian braininess associated with geeks, software engineers.
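Since the icosahedron keeps turning up, here's a minimal sketch of its geometry, my own illustration (plain Python, standard library only) rather than anything taken from those logos: the twelve vertices fall out of the golden ratio as the cyclic permutations of (0, ±1, ±φ), and counting nearest-neighbor pairs recovers the thirty edges.

```python
from math import dist, isclose, sqrt

PHI = (1 + sqrt(5)) / 2  # golden ratio, ~1.618

# The 12 vertices of a regular icosahedron: the cyclic
# permutations of (0, ±1, ±PHI).
verts = []
for sy in (1.0, -1.0):
    for sz in (PHI, -PHI):
        x, y, z = 0.0, sy, sz
        verts += [(x, y, z), (y, z, x), (z, x, y)]

# Edges join the closest vertex pairs (distance 2.0 in
# these coordinates); each vertex has exactly 5 neighbors.
edges = [(i, j) for i in range(len(verts))
         for j in range(i + 1, len(verts))
         if isclose(dist(verts[i], verts[j]), 2.0)]

print(len(verts), len(edges))  # 12 vertices, 30 edges
```

With 12 vertices and 30 edges, Euler's formula (V - E + F = 2) gives back the 20 triangular faces, a tidy check on why Synergetics finds the shape so congenial.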
"AI Should be Open" is the theme of the talk I'm in. Blogging while taking in a talk is not considered rude in this culture. Lots of geeks have their laptops open besides me. ONNX version 1 would be my slide of the moment, but I've got the camera packed away, having pigged out on photo ops in the booth crawl this morning.
Data sets that pretend to know what "people of color" means without the services of a translator are pushing it, in terms of thinking they know what the bias is. One of our keynotes was by a speaker identifying with her Black female student labels, and she eldered our group to improve our social skills, not just as individuals, but as managers of data sets.
Don't go home to your insurance company or Capital One (or Home Depot) thinking that just because you're an engineer, you have no standards when it comes to social engineering. On the contrary, engineers have their guilds with reputations. If you have unwanted biases, you will need to look at them first.