Department of Computer Science and Engineering, St. Joseph's College of Engineering and Technology, Palai, Kerala, India
Abstract:- Smartphone sensors are increasingly used to detect physical activities. The sensors currently employed include the accelerometer, gyroscope, and barometer. Smartphones, equipped with a rich set of sensors, have recently been explored as alternative platforms for human activity recognition. Automatic recognition of physical activities, commonly referred to as human activity recognition (HAR), has emerged as a key research area in human-computer interaction (HCI) and in mobile and ubiquitous computing. One goal of activity recognition is to provide information on a user's behavior that allows computing systems to proactively assist users with their tasks. Human activity recognition requires running classification algorithms originating from statistical machine learning. Mostly, supervised or semi-supervised learning techniques are utilized, and such techniques rely on labeled data, i.e., data associated with a specific class or activity. In most cases, the user is required to label the activities, which increases the burden on the user. Hence, user-independent training and activity recognition are required to foster the adoption of human activity recognition systems, where the system can use training data from other users to classify the activities of a new subject.
Keyword:- Human Activity Recognition
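The user-independent training described in the abstract is commonly evaluated with a leave-one-subject-out protocol: the classifier is trained on all users except one and tested on the held-out user. The sketch below illustrates this with a simple nearest-centroid classifier and entirely synthetic feature vectors, subject IDs, and activity labels (none of these values are from the paper):

```python
# Sketch of user-independent ("leave-one-subject-out") evaluation for HAR.
# Subjects, feature values, and labels below are invented for illustration.
import math

# (subject_id, [feature vector], activity label) -- synthetic examples
samples = [
    ("s1", [0.1, 0.2], "sit"), ("s1", [1.8, 2.1], "walk"),
    ("s2", [0.2, 0.1], "sit"), ("s2", [2.0, 1.9], "walk"),
    ("s3", [0.0, 0.3], "sit"), ("s3", [1.9, 2.2], "walk"),
]

def nearest_centroid_predict(train, x):
    # Average each class's training vectors, then pick the closest centroid.
    groups = {}
    for _, feats, label in train:
        groups.setdefault(label, []).append(feats)
    best, best_d = None, math.inf
    for label, vecs in groups.items():
        centroid = [sum(col) / len(vecs) for col in zip(*vecs)]
        d = math.dist(centroid, x)
        if d < best_d:
            best, best_d = label, d
    return best

def leave_one_subject_out_accuracy(samples):
    # Hold out each subject in turn; train on the rest, test on the held-out one.
    subjects = {s for s, _, _ in samples}
    correct = total = 0
    for held_out in subjects:
        train = [s for s in samples if s[0] != held_out]
        test = [s for s in samples if s[0] == held_out]
        for _, feats, label in test:
            correct += nearest_centroid_predict(train, feats) == label
            total += 1
    return correct / total

print(leave_one_subject_out_accuracy(samples))  # 1.0 on this separable toy data
```

Real HAR systems would substitute a stronger classifier, but the evaluation loop is the same: accuracy measured this way reflects how well the system generalizes to a new user who never labeled any data.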
I. Introduction
Mobile phones, or smartphones, are rapidly becoming the central computing and communication devices in people's lives. Smartphones, equipped with a rich set of sensors, are being explored as an alternative platform for human activity recognition in the ubiquitous computing domain. Today's smartphone not only serves as the key mobile computing and communication device of choice, but also comes with a rich set of embedded sensors [1], such as an accelerometer, digital compass, gyroscope, GPS, microphone, and camera. Collectively, these sensors are enabling new applications across a wide variety of domains, such as healthcare, social networking, safety, environmental monitoring, and transportation, and give rise to a new area of research called mobile phone sensing. Human activity recognition systems using different sensing modalities, such as cameras or wearable inertial sensors, have long been an active field of research. Beyond the included sensors (accelerometer, compass, gyroscope, proximity, light, GPS, microphone, camera), the ubiquity and unobtrusiveness of phones and the availability of different wireless interfaces, such as Wi-Fi, 3G, and Bluetooth, make them an attractive platform for human activity recognition. Current research in activity monitoring and reasoning has mainly targeted elderly people, athletes, and patients with chronic conditions.
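Before any classification, a recognition pipeline built on these sensors typically segments the raw signal into fixed-length windows and computes summary features per window. A minimal sketch, assuming a tri-axial accelerometer stream and two illustrative features (mean and standard deviation of the acceleration magnitude); the window length and readings are invented, not taken from any particular system:

```python
# Sketch: turning a raw tri-axial accelerometer stream into fixed-size
# feature vectors via non-overlapping sliding windows -- a common first
# step in HAR. Window length and sample values are illustrative only.
import math

def window_features(stream, window=4):
    # stream: list of (ax, ay, az) readings at a fixed sampling rate.
    features = []
    for start in range(0, len(stream) - window + 1, window):
        win = stream[start:start + window]
        # Acceleration magnitude is orientation-independent.
        mags = [math.sqrt(ax*ax + ay*ay + az*az) for ax, ay, az in win]
        mean = sum(mags) / len(mags)
        var = sum((m - mean) ** 2 for m in mags) / len(mags)
        features.append((mean, math.sqrt(var)))  # (mean magnitude, std dev)
    return features

# Four "still" readings (gravity only), then four "moving" readings.
readings = [(0.0, 0.0, 9.8)] * 4 + [(1.0, 2.0, 9.0)] * 4
print(window_features(readings))
```

Each resulting feature vector, paired with an activity label, is what the supervised classifiers discussed above consume; richer features (frequency-domain energy, axis correlations) are common in practice.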
The percentage of elderly people in today's societies keeps growing. As a consequence, there is a growing need to support older adults experiencing a loss of cognitive autonomy who wish to continue living independently in their homes rather than being forced into hospital care. Smart environments have been developed to provide such support to elderly people, or to people with risk factors, who wish to continue living independently at home as opposed to living in institutional care. To qualify as a smart environment, the house should be able to detect what the occupant is doing in terms of daily activities. It should also be able to detect possible emergency situations. Furthermore, once such a system is complete and fully operational, it should be able to detect anomalies or deviations in the occupant's routine, which could indicate a decline in his or her abilities. To obtain accurate results, as much information as possible must be retrieved from the environment, enabling the system to locate and track the supervised person at every moment and to detect the position of the limbs and of the objects the person interacts with or intends to interact with. Sometimes, details such as gaze direction or hand gestures [1] can provide important information when analyzing human activity. Thus, the supervised person must be located in a smart environment equipped with devices such as sensors, multiple-view cameras, or speakers.
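Detecting deviations in an occupant's routine, as described above, can be framed simply: compare each day's recognized-activity counts against a per-activity baseline learned from past days. A minimal sketch using a mean-plus-k-standard-deviations threshold; the activity names, counts, and threshold are invented for illustration:

```python
# Sketch: flag deviations in an occupant's daily routine by comparing one
# day's activity counts against a per-activity baseline (mean +/- k * std).
# Activity names, counts, and k are invented for illustration.
import math

def routine_deviations(history, today, k=2.0):
    # history: list of {activity: count} dicts, one per past day.
    # today: {activity: count} for the day being checked.
    deviations = []
    for activity in today:
        counts = [day.get(activity, 0) for day in history]
        mean = sum(counts) / len(counts)
        std = math.sqrt(sum((c - mean) ** 2 for c in counts) / len(counts))
        # Guard against zero variance so a constant baseline still flags change.
        if abs(today[activity] - mean) > k * max(std, 1e-9):
            deviations.append(activity)
    return deviations

history = [{"meals": 3, "walks": 2}, {"meals": 3, "walks": 1},
           {"meals": 3, "walks": 2}, {"meals": 3, "walks": 1}]
print(routine_deviations(history, {"meals": 1, "walks": 2}))  # flags "meals"
```

A deployed system would use longer baselines and more robust statistics, but the idea is the same: a marked, sustained drop in routine activities is the signal that warrants attention.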