February 23, 2025

7 thoughts on “Information theory: Is the light on or off”

  1. Sam, but what if I don’t want to be subjected to dataveillance? Do I have a choice any longer?

    The implications of this question scare me, as they are so far from everything that I’ve been taught about the American ideal. Why should I throw that away? What benefit am I, or society, obtaining via these advanced forms of ‘Technique’?

  2. Great question. We know that security through obscurity is a thin, easily penetrated veil. That leaves non-participation as an avenue, and even that is likely not possible. To some, dataveillance is good. Fear drives the abandonment of freedom. So that leads to the question of value, which is likely predicated on personal point of view. The value is in societal control, which is also where the scary stuff is. It is a loaded gun with a hair trigger pointed at freedom of censure. But, what is unknown is who has their finger on the trigger. Dataveillance is how we secure society and provide safety. What is not discussed is the cost in eroded freedoms.

  3. “But, what is unknown is who has their finger on the trigger.”

    Well, isn’t it whoever is best positioned to process OODA fastest on the widest possible data set, whether that data set is private or public? If my assessment is correct, then the “who” is the Pentagon and its universe of high-tech affiliate companies.

    How does the individual, as we currently know it, survive this societal framework?

  4. If privacy is a concern then there is very little to be done. Give up using credit cards and shopping cards. Go to a cash-based system (strange sounding only a decade into plastic, but still viable). Only use non-telemetry-equipped media and entertainment (no cable TV for you). Use data obfuscation and signal obscuration technologies like TOR and anonymous proxies. You don’t have to give up all technology; just moderate ahead of time how you use it. Each person has a digital footprint, and the use of any technology can provide (following the metaphor) a footfall that might be detected. There are a couple of books out there on how to minimize that footprint. Maybe a blog post is in order on that.
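
    As a concrete illustration of the proxy idea, here is a minimal sketch, assuming a local Tor daemon listening on its default SOCKS port 9050 (the Tor Browser bundle listens on 9150 instead) and the requests[socks] extra installed; the endpoint httpbin.org/ip is just a convenient echo service for this example.

        import requests

        # Assumes a local Tor daemon on its default SOCKS port 9050
        # and PySocks installed: pip install requests[socks]
        TOR_PROXIES = {
            "http": "socks5h://127.0.0.1:9050",   # socks5h: resolve DNS through the proxy too
            "https": "socks5h://127.0.0.1:9050",
        }

        def apparent_ip(proxies=None):
            """Return the IP address a remote server sees for this client."""
            resp = requests.get("https://httpbin.org/ip", proxies=proxies, timeout=30)
            resp.raise_for_status()
            return resp.json()["origin"]

        print("Direct connection:", apparent_ip())         # your real address
        print("Through Tor:", apparent_ip(TOR_PROXIES))    # a Tor exit relay's address

    The socks5h scheme matters here: it pushes DNS lookups through the proxy as well, so name resolution does not leak outside the tunnel.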

  5. “If privacy is a concern then there is very little to be done.”

    Thomas Jefferson is rolling in his grave. What did so many Americans give their lives for… a global network of sousveillance?

    “Each person has a digital footprint, and the use of any technology can provide (following the metaphor) a footfall that might be detected.”

    With dataveillance tech like Palantir.com and bit.ly, social systems can be modeled at quite a granular level. Therefore, feedback control systems of significant dexterity are all but inevitable, especially as dataveillance migrates to real time, affecting everyone within that society, irrespective of any personal choice to opt out of any single technique within that system.

    Who is asking the societal question about the impact of these techniques on humanity, and why is this entire discussion cast out into the desert of the real, where no one can hear it or participate in it? Is anyone at Purdue, perhaps in the philosophy department, exploring these questions publicly?

    By the way, here’s bit.ly’s public-facing data scientist in a video presentation showing off their stuff.
    http://www.youtube.com/watch?v=G6_UtrZsiBo

  6. OK, the background music to the YouTube video was kind of eerie 🙂

    As to the topic: most Americans feel that they have given up nothing and expect nothing from surveillance culture. There simply has been no data to support the idea that surveillance is even perceived by Americans as an issue, except in small pockets of aware populations. Even criminal elements begin to ignore public camera systems after a while.

    There are a few people asking the broader social questions, but there is no support for that work. I have worked with the EFF and other groups (see the amicus brief in my CV) dealing with the abuse of tools by law enforcement. Other people have looked at the ethical nature of some of the tools. I personally believe that the tools, being dual-use, are nothing more than tools. It is the specific use, and the goal of that use, that can be reprehensible.

    It is interesting that you mention opt-out versus opt-in. For opt-out to be morally and ethically sound as a structure for protection, it requires a fully engaged and committed individual who understands the ramifications of their decision. Opt-in, however, is the better default condition when setting up regulatory choices. It is the less ambiguous choice and does not violate the unknowing or unaware. Obviously business doesn’t like that second model at all.

  7. I wish we had a choice about opting in or out, but that choice does not really exist, does it? And if we cannot master the tools (i.e., ‘technique’), then how can we assure ourselves that they serve society? Let me provide the following excerpt…

    The Future Does Not Compute
    http://www.netfuture.org/fdnc/ch25.html

    An Inability to Master Technology

    “Jacques Ellul says much the same thing when he points to the ‘Great Innovation’ that has occurred over the past decade or two. The conflict between technology and the broader values of society has ceased to exist. Or, rather, it has ceased to matter. No longer do we tackle the issues head on. No longer do we force people to adapt to their machines. We neither fight against technology nor consciously adapt ourselves to it. Everything happens naturally, ‘by force of circumstances,’

    ‘….because the proliferation of techniques, mediated by the media, by communications, by the universalization of images, by changed human discourse, has outflanked prior obstacles and integrated them progressively into the process. It has encircled points of resistance, which then tend to dissolve. It has done all this without any hostile reaction or refusal …. Insinuation or encirclement does not involve any program of necessary adaptation to new techniques. Everything takes place as in a show, offered freely to a happy crowd that has no problems.’ (quote from The Technological Bluff)

    It is not that society and culture are managing to assimilate technology. Rather, technology is swallowing culture.”

    The Technological Bluff, by Jacques Ellul
    http://books.google.com/books?id=QXIDzfx19SkC&pg=PA18&lpg=PA18
