NEIL

NEIL (Never Ending Image Learning) has been running at Carnegie Mellon University in the US since July 2013, 24 hours a day. Guided by computer vision algorithms, it scans millions of images from the Internet. Its purpose is to build a massive visual knowledge base containing common-sense associations between the objects, actions, properties, and relationships that it identifies in images. According to the project's contributors, human intervention is limited to correcting erroneous associations. Under its motto "I crawl, I see, I learn", NEIL aims to draw not only on Google Images but also on YouTube videos as a source of knowledge.
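The kind of knowledge base described above — associations between objects, properties, and relationships, accumulated as evidence from many images — can be sketched as a toy structure. This is a minimal illustration, not NEIL's actual code; all names and relations here are hypothetical:

```python
# Toy sketch of a NEIL-style visual knowledge base: it accumulates
# (subject, relation, object) associations with observation counts.
# Illustrative only; the real system's representation is more complex.
from collections import defaultdict

class VisualKnowledgeBase:
    """Stores common-sense associations and how often each was observed."""
    def __init__(self):
        self.counts = defaultdict(int)

    def observe(self, subject, relation, obj):
        # Each detection in an image adds evidence for an association.
        self.counts[(subject, relation, obj)] += 1

    def associations_for(self, subject, min_count=1):
        # Return associations about `subject` seen at least `min_count` times.
        return [(s, r, o, c) for (s, r, o), c in self.counts.items()
                if s == subject and c >= min_count]

kb = VisualKnowledgeBase()
kb.observe("wheel", "is_part_of", "car")
kb.observe("wheel", "is_part_of", "car")
kb.observe("car", "is_found_in", "street")

print(kb.associations_for("wheel"))
# → [('wheel', 'is_part_of', 'car', 2)]
```

The counting models the idea that repeated sightings across many crawled images strengthen an association, while the human correction the project mentions would amount to deleting spurious triples.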

After hours of binging on YouTube, one notices one's own viewing patterns mediated back: the next video that appears is precisely the one one would expect to see. A machine "burning" to crawl images and YouTube certainly has more to learn than someone "burning" through videos, even if the raw material is the same. For whom would such an intensive accumulation of knowledge from the internet be useful? The funding comes from Google and from the Office of Naval Research, the research arm of the US Navy.

In November 2013, NEIL's operation was reported on several technology news websites, with the Associated Press as the source. Regarding the project's funding sources and purposes, the report notes:

Neither Google nor the Office of Naval Research responded to questions regarding the reasons for funding NEIL, but there are some indications. The Office of Naval Research’s website notes that “the environment of modern battlefields is much more complex than in the past” and that “the rate at which data reaches the decision-making system is increasing, while the number of people available to convert data into actionable intelligence is decreasing.” In other words, computers might be making some decisions in future wars. The Navy’s website states: “In many operational scenarios, human presence is not an option.”

If objections to the mechanization of knowledge and the corresponding transformations of labor are difficult to articulate clearly as such from the outset, there may be another point of departure for them. Funding with a clearly antisocial, in reality warlike, character exists not only in US universities but everywhere. For years now, companies, the military, and universities have not answered such questions (about purposes), so long as those questions are not posed in an antagonistic, activist manner.

cyborg #04 – 10/2015