In 2012, Palantir quietly embedded itself into the daily operations of the New Orleans Police Department. There was no public announcement, and no contract was made available to the city council. Instead, the surveillance company partnered with a local nonprofit to sidestep oversight, gaining access to years of arrest records, licenses, addresses, and phone numbers, all to build a shadowy predictive policing program.

Palantir’s software mapped webs of human relationships, assigned residents algorithmic “risk scores,” and helped police generate “target lists,” all without public knowledge. “We very much like to not be publicly known,” a Palantir engineer wrote in an internal email later obtained by The Verge.

After years spent quietly powering surveillance systems for police departments and federal agencies, the company has rebranded itself as a frontier AI firm, selling machine learning platforms designed for military dominance and geopolitical control.

"AI is not a toy. It is a weapon,” said CEO Alex Karp. “It will be used to kill people.”

  • StupidBrotherInLaw@lemmy.world · 2 days ago (+26/−2)

    Colour me surprised, Bloomcole with another shit-tier take. I only don’t block them because they have by far my lowest Voyager-tracked user score, -83 as of this comment, and I’m curious to see how low it’ll go simply by incidental Bloomcole exposure.

    • AHamSandwich@lemmy.world · 2 days ago (+12/−1)

      Checked their post history, and it did not disappoint. Check out where they respond to their own moderated, removed comment in a two-week-old dead post. They had a bit of a tantrum when I pointed that out.

    • gwilikers@lemmy.ml · 2 days ago (+3)

      Ooooh. I had a feeling this was a thing. Got here via another post and was just blown away by how bad this take was.