Incredible, but true. The search functionality of Docs.com, a server platform owned by Microsoft Corporation (NasdaqGS: MSFT), exposes the Personally Identifiable Information of hundreds - perhaps, should we say, thousands - of users... Does the company really believe that dropping the search functionality will relieve it of risk?
Why weren't prudent safeguards put in place to protect the company's users (and the company as well)? At the very least, a check for PII to help mitigate the company's risk exposure? Does the company check for malware or evil embedded macros in these documents? Who forgot to check for PII? Was the company's well-seasoned legal department part of the sign-off process for this debacle?
"More details on the attacks and proposed countermeasures are available in the research paper titled 'Malware Guard Extension: Using SGX to Conceal Cache Attacks.'" - via Catalin Cimpanu at BleepingComputer
Apparently, this product is now embedded in a wide range of devices (ranging from Apple Inc. to Dell Computers and more). I do architect and advise endpoint security efforts in my work (agnostic that I am - I do not recommend individual products), but certainly not a product embedded in BIOS or EFI. Could it rightly be called 'The Self-Healing Endpoint of Privacy'? Has a meme been created? You be the judge. Me? I'm going back to paper and pencil, air-gapped (of course - dammit, air gaps are no guarantee of a secure platform either...). What to do. Tip o' the Hat.
Via Motherboard writer Michael Byrne comes this well-wrought piece on the apparent proliferation of bots on Twitter, i.e., the implications of algorithm-driven entities in the Twitterverse. The fascinating component of this study by Onur Varol, Emilio Ferrara, Clayton A. Davis, Filippo Menczer, and Alessandro Flammini is the use of a machine-learning apparatus (and the feature sets therein) to tease out the truth. Additional documentation (in the form of the paper) is available on arXiv. Today's MustRead.
"Part of what makes the new research interesting is the sheer number of features used in the classification model..." - Motherboard's Michael Byrne