Time | Agenda Item | Lead | Notes
2 min
  • Start recording
  • Welcome & antitrust notice
  • Introduction of new members
  • Agenda review
Chairs
  • Antitrust Policy Notice: Attendees are reminded to adhere to the meeting agenda and not participate in activities prohibited under antitrust and competition laws.
  • ToIP Policy: Only members of ToIP who have signed the necessary agreements are permitted to participate in this activity beyond an observer role.
  • ToIP TSWG IPR Policy: see the TF wiki page: AI & Metaverse Technology Task Force
3 mins
  • Introduction of new members
  • Any general announcement, news, that could be of interest to the TF
All
  • Presentation scheduled for the July 19 All Member meeting on "Digital Trust in the age of Generative AI" - Wenjing Chu.
45 mins
  • Unique Identification of 50,000+ Virtual Reality Users from Head & Hand Motion Data
  • Inferring Private Personal Attributes of VR Users from Head and Hand Motion Data

Guest Speaker: Vivek Nair, Hertz Fellow, UC Berkeley

Anita Rao gave a brief introduction.


LinkedIn announcement: https://tinyurl.com/mrypn4c3

Vivek Nair develops cutting-edge cryptographic techniques to defend digital infrastructure against sophisticated cyber threats. Nair believes that for every problem that exists in cybersecurity, there is a cryptographic solution waiting to be found. Vivek will present these two recent studies from Berkeley RDI:

Unique Identification of 50,000+ Virtual Reality Users from Head & Hand Motion Data (https://arxiv.org/abs/2302.08927)

Inferring Private Personal Attributes of Virtual Reality Users from Head and Hand Motion Data (https://arxiv.org/abs/2305.1919)

#AIM #ToIP #VR #AR #security #data #digitaltrust #privacy

Notes from Vivek's presentation:

  • First study: builds on a historic study of motion-based identification from the 1970s.
  • The same basic idea, fast-forwarded to today. Key distinctions: a large and more diverse pool of Beat Saber players (50,000+), making the results statistically more significant and representative; gameplay recordings as the data source; and a high identification rate of 95%+ (see the illustrative sketch after these notes).
  • Motion-based identification is highly effective, comparable to or stronger than iris or fingerprint recognition; only facial recognition performs markedly better.
  • Context (the scene being played) is useful but not a major contributor on its own in this study.
  • Motion data cannot be hidden from the apps: motion events are essential to gameplay, so they have to be shared with the apps.
  • Second study: 
  • Treat motion data as a "language" - as in "body language".
  • Use transformer-based learning to determine what additional personal information can be inferred with statistical significance: weight and height, but also income, country, disability, and more - 40+ personal attributes (see the second sketch after these notes).
  • The "privacy layer" of a VR device also typically sends all significant motion data to all other devices in the VR session, because latency requirements demand that rendering happens locally on each device.
  • It is as if you were walking through a public square while broadcasting all of your personal information.
  • We discussed Apple's Vision Pro announcement and its implications for privacy in light of these results. Vivek: we have a very narrow window to devise a solution to this problem before VR devices, as currently designed, become the next iPhone - something the world can't live without.
  • THANKS to Vivek Nair for the wonderful presentation - this is hugely important for all of us! 
  • Encourage everyone to check out additional information:
    • Link to UCB:  
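
To make the first study's result concrete, here is a minimal sketch of motion-based identification. It is not the Berkeley pipeline: the head/hand tracking streams are synthetic stand-ins for Beat Saber recordings, the per-channel summary statistics are a simplified stand-in for the paper's features, and the random-forest classifier is a generic choice rather than the authors' model.

```python
# Minimal sketch of motion-based identification (NOT the study's pipeline).
# Assumptions: synthetic head/hand tracking streams, per-channel summary
# statistics as features, and a generic scikit-learn classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_session(user_bias, n_frames=300):
    """Fake tracking stream: head + two hands as 9 channels per frame."""
    noise = rng.normal(0.0, 0.05, size=(n_frames, 9))
    return user_bias + noise  # each user has a characteristic offset

def featurize(session):
    """Per-channel summary statistics: mean, std, min, max."""
    return np.concatenate([session.mean(0), session.std(0),
                           session.min(0), session.max(0)])

n_users, sessions_per_user = 50, 20
user_biases = rng.normal(0.0, 0.3, size=(n_users, 9))

X = np.array([featurize(synthetic_session(user_biases[u]))
              for u in range(n_users) for _ in range(sessions_per_user)])
y = np.repeat(np.arange(n_users), sessions_per_user)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          stratify=y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"identification accuracy on synthetic data: {clf.score(X_te, y_te):.2f}")
```

Even this crude setup tends to separate users once each one has a consistent motion signature, which is the intuition behind the high identification rates reported at much larger scale in the study.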
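For the second study's "motion as body language" framing, here is a similarly hedged sketch: a small transformer encoder reads a sequence of motion frames as tokens and predicts a single attribute. The channel count, layer sizes, and the binary attribute are illustrative assumptions, not details from the paper.

```python
# Minimal sketch of attribute inference from motion treated as a "language"
# (NOT the study's model). A tiny transformer encoder reads a sequence of
# motion frames and predicts a single synthetic attribute.
import torch
import torch.nn as nn

class MotionAttributeModel(nn.Module):
    def __init__(self, n_channels=9, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)      # frame -> token embedding
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=64,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)                # one attribute (binary here)

    def forward(self, motion):                           # motion: (batch, frames, channels)
        tokens = self.embed(motion)
        encoded = self.encoder(tokens)
        pooled = encoded.mean(dim=1)                     # average over time
        return self.head(pooled).squeeze(-1)             # logit for the attribute

# Synthetic stand-in data: 64 sequences of 100 frames x 9 channels.
motion = torch.randn(64, 100, 9)
attribute = (motion.mean(dim=(1, 2)) > 0).float()        # fake binary attribute

model = MotionAttributeModel()
loss = nn.BCEWithLogitsLoss()(model(motion), attribute)
loss.backward()                                          # one illustrative gradient step
print(f"illustrative loss: {loss.item():.3f}")
```

The same encoder-plus-head pattern extends to many attributes by swapping the output layer, which is why a single motion stream can leak a broad set of personal characteristics at once.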
10 mins
  • White paper status updates & call for additional blogs & white papers. (Skipping)

@phil indicated that he might not be able to work on it for a while.

Regular updates from all lead authors: @sandy, Anita Rao, and @wenjing.

Call for additional blogs/white papers.

5 mins
  • Review decisions/action items
  • Planning for next meeting 
  • AOB
Chairs



