Summary of slides of 0HM280: Human-Robot Interaction
Lecture 1
Interaction scenario = A story combining simple actions to achieve a goal that the user of a robot wants to accomplish.
Semantic world model = Meaningful description of the world.
Both meaningful and idle motions are similarly lifelike.
Meaningful motions make the robot appear more likeable, intelligent and emotionally
responsive.
Lecture 2
Robot navigation deals with uncertainties such as:
- Noisy sensors
- Outdated maps
- Unknown location
- Inaccurate odometry and dead reckoning
Filters are used to update the robot's belief in the presence of these uncertainties.
Fundamental notion of probability: we can assign real numbers (probabilities) to events from a sample space.
Frequentist interpretation of probability: Frequency of occurrence.
Bayesian interpretation of probability: Probability is a graded belief about an event.
Random variables are used to represent an uncertain outcome. They come in two kinds: discrete and continuous.
X = random variable. A discrete random variable takes values from a countable set {x1, x2, …, xn}.
P(X=xi) or P(xi) = the probability that the random variable X takes on the value xi.
P(x) = probability mass function (PMF).
Binomial probability distribution = The number of ‘heads’ when tossing a coin n times;
probability of saying ‘yes’ in a 2AFC task.
Poisson distribution = Number of α particles emitted by a radioactive source; number of
spikes generated by a neuron.
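As a quick numerical illustration of these two PMFs (a minimal sketch using scipy.stats; the coin-toss count and the spike rate are made-up values):

from scipy.stats import binom, poisson

# Binomial: probability of exactly 7 heads in n = 10 tosses of a fair coin
print(binom.pmf(7, n=10, p=0.5))   # ~0.117

# Poisson: probability of exactly 3 spikes when the expected count is 2.5
print(poisson.pmf(3, mu=2.5))      # ~0.214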
p(x) = probability density function. (For a continuous random variable, P(X=x) = 0 for any single value; probabilities are obtained by integrating the density: Pr(a<X<b) = ∫_a^b p(x) dx.)
Uniform probability density: p(x) = 1/(b−a) for a ≤ x ≤ b, and 0 otherwise.
Normal / Gaussian probability density function: p(x) = 1/(σ√(2π)) exp(−(x−μ)²/(2σ²))
- Standard normal distribution: μ=0, σ=1
- X ~ N(μ, σ²) denotes a random variable that follows a normal distribution; it can be standardized via z = (x−μ)/σ
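For example (a sketch with made-up μ and σ), standardizing reduces any normal variable to the standard normal:

from scipy.stats import norm

mu, sigma = 10.0, 2.0                      # made-up parameters
x = 13.0
z = (x - mu) / sigma                       # z follows N(0, 1)
print(norm.cdf(x, loc=mu, scale=sigma))    # ~0.933
print(norm.cdf(z))                         # same value via the standard normal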
Exponential probability density function: p(x) = λ exp(−λx) for x ≥ 0. Often used to model lifetimes or waiting times (usually x is replaced by t in that case).
Continuous probability distributions are densities, not probabilities: a density p(x) may exceed 1. Most importantly, the density integrates to 1: ∫ p(x) dx = 1 over the whole range of x.
Cumulative distribution function (CDF): F(x) = Pr(X ≤ x) = ∫_-∞^x p(x') dx'.
Related to the practice exam.
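A small numerical check (a sketch; the parameters are arbitrary): for the standard normal F(0) = 0.5, and for the exponential density above the CDF has the closed form F(t) = 1 − exp(−λt).

import math
from scipy.stats import expon, norm

# Standard normal: half of the probability mass lies below the mean
print(norm.cdf(0.0))                     # 0.5

# Exponential with rate lambda = 2 (scipy parameterizes by scale = 1/lambda)
lam, t = 2.0, 0.7
print(expon.cdf(t, scale=1.0 / lam))     # ~0.753
print(1.0 - math.exp(-lam * t))          # same value, closed form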
Joint probability distribution = a probability mass/density function of more than one variable.
Discrete: Pr(X=x and Y=y) --> P(x,y)
Continuous: Pr(a<X<b and c<Y<d) --> p(x,y)
If X and Y independent: P(x,y) = P(x) P(y)
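For example (a toy sketch with two independent fair dice), the joint PMF factorizes into the product of the marginals:

from fractions import Fraction

P_x = {i: Fraction(1, 6) for i in range(1, 7)}   # marginal P(X = x)
P_y = {j: Fraction(1, 6) for j in range(1, 7)}   # marginal P(Y = y)

# Independence: P(x, y) = P(x) P(y)
P_xy = {(i, j): P_x[i] * P_y[j] for i in P_x for j in P_y}
print(P_xy[(3, 5)])          # 1/36
print(sum(P_xy.values()))    # 1, so this is a valid joint distribution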
Conditional probability = the probability of one variable X given a value of the other variable Y.
Pr(X|Y=y) (read: the probability of X “given” Y=y).
It is related to the joint probability with the given value of Y substituted: P(X|Y=y) ∝ P(X, Y=y).
Discrete: P(x|y) = P(x,y) / P(y)
Continuous: p(x|y) = p(x,y) / p(y)
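A discrete sketch of this division (the joint table is made up for illustration): the conditional follows by dividing the joint by the marginal of the conditioning variable.

# Made-up joint PMF over X in {0, 1} and Y in {0, 1}
P_xy = {(0, 0): 0.3, (0, 1): 0.2, (1, 0): 0.1, (1, 1): 0.4}

# Marginal P(Y = 1): sum the joint over all x
P_y1 = sum(p for (x, y), p in P_xy.items() if y == 1)

# Conditional P(x | Y = 1) = P(x, 1) / P(1)
P_x_given_y1 = {x: P_xy[(x, 1)] / P_y1 for x in (0, 1)}
print(P_x_given_y1)          # {0: 0.333..., 1: 0.666...}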
Likelihood reflects sensory information. It is a function of the hypotheses: likelihood = p(observation | hypothesis).
Prior reflects prior knowledge about the hypotheses. It is independent of observations.
Posterior reflects the belief in the hypotheses after an observation: p(hypothesis | observation). It takes prior knowledge into account.
Lecture 3
Bayes rule: P(x|z) = P(z|x) P(x) / P(z), i.e., posterior = likelihood × prior / evidence.
Interpretation: consider a robot that wants to know whether a door is open. Then:
P(open|z) is diagnostic
P(z|open) is causal
Often causal knowledge is easier to obtain.
Bayes rule relates causal and diagnostic knowledge: P(open|z) = P(z|open) P(open) / P(z).
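Plugging in numbers (all values below are assumptions for illustration, not taken from the slides):

# Made-up causal sensor model and prior
P_z_given_open   = 0.6     # P(z | open)
P_z_given_closed = 0.3     # P(z | not open)
P_open           = 0.5     # prior belief that the door is open

# Evidence P(z) via total probability, then Bayes rule gives the diagnostic term
P_z = P_z_given_open * P_open + P_z_given_closed * (1 - P_open)
print(P_z_given_open * P_open / P_z)   # P(open | z) ~0.667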
With repeated measurements z is updated and we get z1, z2, etc. Under the Markov assumption, zn is independent of z1, …, zn-1 if we know the state x, so the belief can be updated recursively: P(x | z1, …, zn) ∝ P(zn | x) P(x | z1, …, zn-1).
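A sketch of this recursive measurement update (using the same made-up sensor model values as above):

def measurement_update(belief_open, p_z_open, p_z_closed):
    # Posterior = normalized likelihood * prior (one recursive Bayes step)
    num = p_z_open * belief_open
    return num / (num + p_z_closed * (1.0 - belief_open))

belief = 0.5                    # initial belief that the door is open
for _ in range(3):              # three consecutive "open" readings z1, z2, z3
    belief = measurement_update(belief, 0.6, 0.3)
print(belief)                   # ~0.889; each reading sharpens the belief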
Often the world is dynamic, because the world is changed by:
• actions carried out by the robot,
• actions carried out by other agents,
• or just the time passing by.
Actions are never carried out with absolute certainty. They generally increase uncertainty.
To incorporate the outcome of an action u into the current belief, use the conditional pdf P(x | u, x'): the probability that executing action u in state x' leads to state x. Integrating over all possible prior states x' gives P(x | u) = ∫ P(x | u, x') P(x') dx' (a sum for discrete states).
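A discrete sketch of this prediction step (the action model numbers are assumptions): suppose the action u = "close the door" succeeds with probability 0.9 on an open door, and a closed door always stays closed.

# Made-up action model P(x | u, x') for u = "close the door"
P_closed_if_was_open   = 0.9    # closing an open door succeeds
P_closed_if_was_closed = 1.0    # a closed door stays closed

belief_open = 0.889             # current belief from the measurement updates above
# Total probability over the prior states x'
belief_closed = (P_closed_if_was_open * belief_open
                 + P_closed_if_was_closed * (1.0 - belief_open))
print(1.0 - belief_closed)      # P(open | u) ~0.089 after the action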