Posts

transverse load

Last edited: August 8, 2025

A load applied perpendicular to the long axis of a rod. Think of a metal rod lying flat on the ground: a transverse load would push straight down on it from above.

Tree File Organization

Last edited: August 8, 2025

TrustPOMDP

Last edited: August 8, 2025

Humans either over-rely on a robot’s actions (e.g., sleeping while a Tesla drives) or under-rely on them (constantly interfering).

  1. human and robot interactions may depend on entire history
    1. trust is a proxy for the full interaction history
  2. the human’s policy must be modeled by the robot
  3. trust is demonstrated through real-world experimentation

Formulation

Add two variables:

  • Trust: \(\theta_{t}\), the human’s estimate of the robot’s ability to succeed in a task
  • Performance: \(e_{t+1}\), success or failure in attempting a task

The trust model’s probability of correctly modeling humans is low: there is high variance between participants.
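As a rough sketch (not from the paper; the trust levels and all probabilities below are made up), the robot can maintain a belief over discrete trust levels \(\theta_t\) and update it with Bayes’ rule each time it observes a task outcome \(e_{t+1}\):

```python
# Hypothetical sketch of a trust update in the TrustPOMDP spirit:
# trust theta_t is hidden, so the robot tracks a belief over discrete
# trust levels and updates it after observing performance e_{t+1}.
# All numbers below are invented for illustration.

TRUST_LEVELS = [0, 1, 2]  # low, medium, high trust

# Assumed per-level probability that the human lets the robot succeed.
SUCCESS_PROB = {0: 0.3, 1: 0.6, 2: 0.9}

def transition(theta, success):
    """Deterministic toy dynamics: success nudges trust up, failure down."""
    if success:
        return min(theta + 1, 2)
    return max(theta - 1, 0)

def update_belief(belief, success):
    """Bayes-filter step over trust levels given one observed outcome."""
    new_belief = {t: 0.0 for t in TRUST_LEVELS}
    for theta, p in belief.items():
        likelihood = SUCCESS_PROB[theta] if success else 1 - SUCCESS_PROB[theta]
        new_belief[transition(theta, success)] += p * likelihood
    total = sum(new_belief.values())
    return {t: p / total for t, p in new_belief.items()}

belief = {0: 1 / 3, 1: 1 / 3, 2: 1 / 3}  # uniform prior over trust
belief = update_belief(belief, success=True)
print(belief)  # mass shifts toward higher trust after a success
```

The high between-participant variance mentioned above would show up here as SUCCESS_PROB tables that differ wildly from person to person.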

Tuning Fork

Last edited: August 8, 2025

A tuning fork (funing tork!) is exactly what it sounds like: you smack it and it goes “biiing!”

Let’s figure out how it works. So that we’re on the same page, let’s define some vocab:

Vocab

  • “Tine”: one of the two prongs of the fork

A Cursory Explanation

Source: here and here. Neither is very scientific, but both are a good first step.

From a very basic perspective, hitting a tuning fork creates a transverse wave in the tine you hit; the vibrating tine then compresses the air around it longitudinally at a set frequency, which we hear as a sound.
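To make the “set frequency” idea concrete, here is a toy sketch (my own illustration, not from the sources above) that synthesizes the “biiing!” as a decaying sine wave at a fixed pitch and writes it to a WAV file:

```python
# Toy illustration: the tine vibrates at one set frequency and the ring
# fades out, so model the sound as an exponentially decaying sine wave.
import math
import struct
import wave

FREQ_HZ = 440.0      # a common tuning-fork pitch (concert A)
SAMPLE_RATE = 44100  # samples per second
DURATION_S = 2.0
DECAY = 1.5          # how quickly the ring dies away

samples = []
for n in range(int(SAMPLE_RATE * DURATION_S)):
    t = n / SAMPLE_RATE
    amplitude = math.exp(-DECAY * t)  # the "biiing" fading out
    samples.append(amplitude * math.sin(2 * math.pi * FREQ_HZ * t))

with wave.open("tuning_fork.wav", "wb") as f:
    f.setnchannels(1)   # mono
    f.setsampwidth(2)   # 16-bit samples
    f.setframerate(SAMPLE_RATE)
    f.writeframes(b"".join(
        struct.pack("<h", int(32767 * s)) for s in samples))
```

Playing the resulting file gives a pure, steadily fading tone, which is a decent first approximation of what the fork’s tines do to the air.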

turing machine

Last edited: August 8, 2025

constituents

  • \(Q\) is a finite set of states
  • \(\Sigma\) is the input alphabet, where \(\square \not \in \Sigma\)
  • \(\Gamma\) is the tape alphabet, where \(\square \in \Gamma\), and \(\Sigma \subseteq \Gamma\) (because we can write empty cells)
  • \(\delta: Q \times \Gamma \to Q \times \Gamma \times \qty {L, R}\)
  • \(q_0 \in Q\), the start state
  • \(q_{a} \in Q\), the accept state
  • \(q_{r} \in Q\), the reject state, where \(q_{r} \neq q_{a}\) (needed explicitly because a Turing Machine may not terminate at the end of its input)
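A minimal sketch of this definition in Python (the machine itself and all names are my own invention): a one-state machine over \(\qty{0, 1}\) that flips every bit of its input and accepts when it reaches the blank \(\square\):

```python
# Minimal Turing machine sketch: flip every bit, accept at the blank.
BLANK = "_"  # plays the role of the square (blank) symbol

# delta: (state, read symbol) -> (new state, write symbol, move L/R)
delta = {
    ("q0", "0"): ("q0", "1", "R"),
    ("q0", "1"): ("q0", "0", "R"),
    ("q0", BLANK): ("qa", BLANK, "R"),  # end of input reached: accept
}

def run(tape_input, max_steps=1000):
    """Apply delta until the accept/reject state, or give up (TMs may loop)."""
    tape = dict(enumerate(tape_input))  # sparse tape; unwritten cells are blank
    state, head, steps = "q0", 0, 0
    while state not in ("qa", "qr"):
        if steps >= max_steps:
            return None, tape           # did not halt within the step budget
        state, write, move = delta[(state, tape.get(head, BLANK))]
        tape[head] = write
        head += 1 if move == "R" else -1
        steps += 1
    return state, tape

state, tape = run("0110")
print(state, "".join(tape[i] for i in range(4)))  # qa 1001
```

Note the step counter: it is exactly the “number of times you apply \(\delta\)” notion of runtime, and the cap is needed because nothing in the definition guarantees halting.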

requirements

additional information

why TM is awesome

  1. a Turing Machine \(M = \qty(Q, \Sigma, \Gamma, \delta, q_0, q_{a}, q_{r})\) should be ready for inputs of any length \(n\) (in particular, when designing \(M\), we don’t know what \(n\) will be)
  2. a Turing machine’s computation is “local”—each step sees only the current cell (you can’t look at the whole input), and the overall computation is the composition of these many local steps
  3. No ambiguity as to runtime: it is the number of times you apply \(\delta\) before reaching \(q_{a}\) or \(q_{r}\)

Church-Turing thesis as local steps

“computation is any process that takes place in a sequence of simple, local steps.”