Michelson-Morley

The Michelson-Morley experiment, conducted in 1887 by Albert A. Michelson and Edward W. Morley, aimed to detect the presence of the luminiferous aether, a medium thought to carry light waves. The experiment utilized an interferometer, which split a beam of light into two perpendicular paths, reflecting them back to create an interference pattern. The key hypothesis was that the Earth’s motion through the aether would cause a difference in the travel times of the two beams, leading to a shift in the interference pattern.
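Under the aether hypothesis, the expected travel-time difference between the two arms can be computed classically. A minimal sketch, using illustrative values (effective arm length of about 11 m, Earth's orbital speed of about 30 km/s, light of roughly 500 nm) close to those of the 1887 apparatus; the numbers are assumptions for the example, not exact historical figures:

```python
import math

L = 11.0             # effective arm length in metres (assumed)
v = 30_000.0         # Earth's orbital speed in m/s (assumed)
c = 3.0e8            # speed of light in m/s
wavelength = 5.0e-7  # wavelength of the light in metres (assumed)

# Classical (aether) prediction: the round-trip time along the arm
# parallel to the motion differs from the perpendicular arm's.
t_parallel = 2 * L * c / (c**2 - v**2)
t_perpendicular = 2 * L / math.sqrt(c**2 - v**2)
delta_t = t_parallel - t_perpendicular

# Rotating the apparatus by 90 degrees swaps the roles of the arms,
# doubling the expected displacement of the interference pattern.
fringe_shift = 2 * c * delta_t / wavelength
print(f"expected fringe shift: {fringe_shift:.2f}")
```

The predicted shift of roughly 0.4 of a fringe was well within the instrument's sensitivity, which is why the observed null result was so significant.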

Despite meticulous measurements, the experiment found no significant difference, leading to a null result. This outcome suggested that the aether did not exist, challenging classical physics and ultimately contributing to the development of Einstein's theory of relativity. The Michelson-Morley experiment fundamentally changed our understanding of light propagation and the nature of space, reinforcing the idea that the speed of light is constant in all inertial frames.


Marshallian Demand

Marshallian Demand refers to the quantity of goods a consumer will purchase at varying prices and income levels, maximizing their utility under a budget constraint. It is derived from the consumer's preferences and the prices of the goods, forming a crucial part of consumer theory in economics. The demand function can be expressed mathematically as $x^*(p, I)$, where $p$ represents the price vector of goods and $I$ denotes the consumer's income.

The key characteristic of Marshallian Demand is that it reflects how changes in prices or income alter consumption choices. For instance, if the price of a good decreases, the Marshallian Demand typically increases, assuming other factors remain constant. This relationship illustrates the law of demand, highlighting the inverse relationship between price and quantity demanded. Furthermore, the demand can also be affected by the substitution effect and income effect, which together shape consumer behavior in response to price changes.
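For a concrete case, Cobb-Douglas preferences $U(x_1, x_2) = x_1^a x_2^{1-a}$ yield closed-form Marshallian demands. A minimal sketch, with illustrative prices and income:

```python
# Maximizing U(x1, x2) = x1**a * x2**(1 - a) subject to the budget
# constraint p1*x1 + p2*x2 = I gives the closed-form demands
#   x1*(p, I) = a * I / p1,   x2*(p, I) = (1 - a) * I / p2.

def marshallian_demand(a, p1, p2, income):
    """Utility-maximizing bundle under Cobb-Douglas preferences."""
    x1 = a * income / p1
    x2 = (1 - a) * income / p2
    return x1, x2

# Income 100, equal prices, a = 0.5: spending is split evenly.
print(marshallian_demand(0.5, 2.0, 2.0, 100.0))  # (25.0, 25.0)

# Halving p1 doubles the quantity of good 1 demanded (law of demand),
# while demand for good 2 is unchanged under these preferences.
print(marshallian_demand(0.5, 1.0, 2.0, 100.0))  # (50.0, 25.0)
```

The example also shows the income effect directly: doubling income doubles both demands, since each demand is linear in $I$.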

MicroRNA Expression

MicroRNA (miRNA) expression refers to the production and regulation of small, non-coding RNA molecules that play a crucial role in gene expression. These molecules, typically 20-24 nucleotides in length, bind to complementary sequences on messenger RNA (mRNA) molecules, leading to their degradation or the inhibition of their translation into proteins. This mechanism is essential for various biological processes, including development, cell differentiation, and response to stress. The expression levels of miRNAs can be influenced by various factors such as environmental stress, developmental cues, and disease states, making them important biomarkers for conditions like cancer and cardiovascular diseases. Understanding miRNA expression patterns can provide insights into regulatory networks within cells and may open avenues for therapeutic interventions.

Fama-French Three-Factor Model

The Fama-French Three-Factor Model is an asset pricing model that expands upon the traditional Capital Asset Pricing Model (CAPM) by including two additional factors to better explain stock returns. The model posits that the expected return of a stock can be determined by three factors:

  1. Market Risk: The excess return of the market over the risk-free rate, which captures the sensitivity of the stock to overall market movements.
  2. Size Effect (SMB): The Small Minus Big factor, representing the additional returns that small-cap stocks tend to provide over large-cap stocks.
  3. Value Effect (HML): The High Minus Low factor, which reflects the tendency of value stocks (high book-to-market ratio) to outperform growth stocks (low book-to-market ratio).

Mathematically, the model can be expressed as:

$$R_i = R_f + \beta_i (R_m - R_f) + s_i \cdot SMB + h_i \cdot HML + \epsilon_i$$

Where $R_i$ is the expected return of the asset, $R_f$ is the risk-free rate, $R_m$ is the expected market return, $\beta_i$ is the sensitivity to market risk, $s_i$ is the sensitivity to the size factor, $h_i$ is the sensitivity to the value factor, and $\epsilon_i$ is the idiosyncratic error term.
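The model above can be sketched numerically. All factor premia and loadings below are made-up illustrative values, not estimates from real data:

```python
def fama_french_expected_return(rf, market_premium, smb, hml, beta, s, h):
    """Expected return under the three-factor model (error term omitted,
    since it has zero expectation)."""
    return rf + beta * market_premium + s * smb + h * hml

# A hypothetical stock with market beta 1.1, a small-cap tilt (s = 0.4)
# and a value tilt (h = 0.3), with assumed premia of 6% market,
# 2% SMB and 3% HML, and a 3% risk-free rate:
r = fama_french_expected_return(rf=0.03, market_premium=0.06,
                                smb=0.02, hml=0.03,
                                beta=1.1, s=0.4, h=0.3)
print(f"{r:.4f}")  # 0.1130, i.e. an 11.3% expected return
```

In practice the loadings $\beta_i$, $s_i$ and $h_i$ are estimated by regressing the asset's excess returns on the three factor series.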

Keynesian Trap

The Keynesian Trap refers to a situation in which an economy faces a liquidity trap that limits the effectiveness of traditional monetary policy. In this scenario, even when interest rates are lowered to near-zero levels, individuals and businesses may still be reluctant to spend or invest, leading to stagnation in economic growth. This reluctance often stems from uncertainty about the future, high levels of debt, or a lack of consumer confidence. As a result, the economy can remain stuck in a low-demand equilibrium, with output below potential and unemployment persistently high. In such cases, fiscal policy (government spending and tax cuts) becomes crucial, as it can stimulate demand directly when monetary policy proves ineffective. Thus, the Keynesian Trap highlights the limitations of monetary policy in certain economic conditions and the importance of active fiscal measures to support recovery.

Stochastic Gradient Descent

Stochastic Gradient Descent (SGD) is an optimization algorithm commonly used in machine learning and deep learning to minimize a loss function. Unlike the traditional gradient descent, which computes the gradient using the entire dataset, SGD updates the model weights using only a single sample (or a small batch) at each iteration. This makes it faster and allows it to escape local minima more effectively. The update rule for SGD can be expressed as:

$$\theta = \theta - \eta \nabla J(\theta; x^{(i)}, y^{(i)})$$

where $\theta$ represents the parameters, $\eta$ is the learning rate, and $\nabla J(\theta; x^{(i)}, y^{(i)})$ is the gradient of the loss function with respect to a single training example $(x^{(i)}, y^{(i)})$. While SGD can converge more quickly than standard gradient descent, it may exhibit more fluctuation in the loss function due to its reliance on individual samples. To mitigate this, techniques such as momentum, learning rate decay, and mini-batch gradient descent are often employed.
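The update rule can be sketched for a one-parameter least-squares problem. The dataset, learning rate, and iteration count below are illustrative assumptions:

```python
import random

# Fitting y = w*x with per-sample squared-error loss
# J(w; x_i, y_i) = (w*x_i - y_i)**2, whose gradient is
# dJ/dw = 2 * (w*x_i - y_i) * x_i.
random.seed(0)
data = [(x, 3.0 * x) for x in [0.5, 1.0, 1.5, 2.0, 2.5]]  # true w = 3

w = 0.0      # the parameter theta
eta = 0.05   # the learning rate eta
for epoch in range(200):
    random.shuffle(data)  # "stochastic": visit samples in random order
    for x_i, y_i in data:
        grad = 2.0 * (w * x_i - y_i) * x_i  # gradient on one sample
        w -= eta * grad                     # update: theta -= eta * grad

print(round(w, 3))  # converges near the true value 3.0
```

Each inner iteration touches only one sample, which is what makes SGD cheap per step; the noisy per-sample gradients are also the source of the fluctuations mentioned above.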

Von Neumann Utility

The Von Neumann Utility theory, developed by John von Neumann and Oskar Morgenstern, is a foundational concept in decision theory and economics that pertains to how individuals make choices under uncertainty. At its core, the theory posits that individuals can assign a numerical value, or utility, to different outcomes based on their preferences. This utility can be represented as a function $U(x)$, where $x$ denotes different possible outcomes.

Key aspects of Von Neumann Utility include:

  • Expected Utility: Individuals evaluate risky choices by calculating the expected utility, which is the weighted average of utility outcomes, given their probabilities.
  • Rational Choice: The theory assumes that individuals are rational, meaning they will always choose the option that maximizes their expected utility.
  • Independence Axiom: This principle states that if a person prefers option A to option B, they should still prefer a lottery that offers A with a certain probability over a lottery that offers B, provided the structure of the lotteries is the same.

This framework allows for a structured analysis of preferences and choices, making it a crucial tool in both economic theory and behavioral economics.
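The expected-utility calculation above can be sketched with lotteries as probability-outcome pairs. The concave utility function $u(x) = \sqrt{x}$ and the payoffs are illustrative assumptions:

```python
import math

def expected_utility(lottery, u):
    """Probability-weighted average of the utilities of the outcomes."""
    return sum(p * u(x) for p, x in lottery)

u = math.sqrt  # an assumed risk-averse (concave) utility function

safe   = [(1.0, 100.0)]              # 100 for certain
gamble = [(0.5, 0.0), (0.5, 200.0)]  # fair coin: 0 or 200

# Both lotteries have the same expected monetary value (100), but the
# concave utility makes the sure thing preferable: risk aversion.
print(expected_utility(safe, u))    # 10.0
print(expected_utility(gamble, u))  # about 7.07
```

This illustrates why expected utility, rather than expected monetary value, is the quantity the theory assumes rational agents maximize.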