
Planck’s Law

Planck's Law describes the electromagnetic radiation emitted by a black body in thermal equilibrium at a given temperature. It establishes that the intensity of radiation emitted at a specific wavelength is determined by the temperature of the body, following the formula:

$$I(\lambda, T) = \frac{2hc^2}{\lambda^5} \cdot \frac{1}{e^{\frac{hc}{\lambda k T}} - 1}$$

where:

  • $I(\lambda, T)$ is the spectral radiance,
  • $h$ is Planck's constant,
  • $c$ is the speed of light,
  • $\lambda$ is the wavelength,
  • $k$ is the Boltzmann constant,
  • $T$ is the absolute temperature in kelvin.

This law is pivotal in quantum mechanics because it introduced the concept of quantized energy, which led to the development of quantum theory. It also explains why hotter objects radiate more energy overall and why their emission peaks at shorter wavelengths, deepening our understanding of thermal radiation and of how radiated energy is distributed across wavelengths.
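A minimal numerical sketch of the formula in Python, assuming SI values for the constants and an illustrative wavelength and temperature (roughly the Sun's surface) that are not taken from the text:

```python
import math

# Physical constants in SI units
h = 6.62607015e-34   # Planck's constant, J*s
c = 2.99792458e8     # speed of light, m/s
k = 1.380649e-23     # Boltzmann constant, J/K

def spectral_radiance(wavelength, temperature):
    """Planck's law: spectral radiance I(lambda, T) in W * sr^-1 * m^-3."""
    exponent = h * c / (wavelength * k * temperature)
    return (2.0 * h * c**2 / wavelength**5) / math.expm1(exponent)

# Example: 500 nm light from a body at ~5778 K
print(spectral_radiance(500e-9, 5778.0))
```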

Shannon Entropy Formula

The Shannon entropy formula is a fundamental concept in information theory introduced by Claude Shannon. It quantifies the amount of uncertainty or information content associated with a random variable. The formula is expressed as:

$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i)$$

where $H(X)$ is the entropy of the random variable $X$, $p(x_i)$ is the probability of occurrence of the $i$-th outcome, and $b$ is the base of the logarithm, often chosen as 2 so that entropy is measured in bits. The negative sign ensures that the entropy is non-negative: since probabilities lie between 0 and 1, their logarithms are non-positive. In essence, the Shannon entropy measures the unpredictability of the information content; the higher the entropy, the more uncertain or diverse the information, making it a crucial tool in fields such as data compression and cryptography.
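A short sketch of the formula in Python; the example distributions (a fair coin and a biased coin) are illustrative assumptions:

```python
import math

def shannon_entropy(probabilities, base=2):
    """H(X) = -sum_i p(x_i) * log_b p(x_i); outcomes with zero probability contribute nothing."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # biased coin: about 0.469 bits
```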

Legendre Polynomials

Legendre polynomials are a sequence of orthogonal polynomials that arise in solving problems in physics and engineering, particularly in potential theory and quantum mechanics. They are defined on the interval $[-1, 1]$ and are denoted by $P_n(x)$, where $n$ is a non-negative integer. The polynomials can be generated using the recurrence relation:

$$P_0(x) = 1, \quad P_1(x) = x, \quad P_{n+1}(x) = \frac{(2n + 1)\,x\,P_n(x) - n\,P_{n-1}(x)}{n + 1}$$

These polynomials exhibit several important properties, such as orthogonality with respect to the weight function $w(x) = 1$:

$$\int_{-1}^{1} P_m(x)\, P_n(x)\, dx = 0 \quad \text{for } m \neq n$$

Legendre polynomials also play a critical role in series expansions of functions and in solving partial differential equations, particularly in spherical coordinates, where they appear as solutions to Legendre's differential equation.
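A minimal sketch of the recurrence in Python; the test point is an illustrative choice that can be checked against the closed form $P_2(x) = (3x^2 - 1)/2$:

```python
def legendre(n, x):
    """Evaluate P_n(x) using the three-term recurrence
    P_{n+1}(x) = ((2n + 1) x P_n(x) - n P_{n-1}(x)) / (n + 1)."""
    if n == 0:
        return 1.0
    p_prev, p_curr = 1.0, x              # P_0(x) and P_1(x)
    for m in range(1, n):
        p_prev, p_curr = p_curr, ((2 * m + 1) * x * p_curr - m * p_prev) / (m + 1)
    return p_curr

print(legendre(2, 0.5))   # P_2(0.5) = (3 * 0.25 - 1) / 2 = -0.125
```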

Root Locus Analysis

Root Locus Analysis is a graphical method used in control theory to analyze how the roots of a system's characteristic equation change as a particular parameter, typically the gain $K$, varies. It provides insights into the stability and transient response of a control system. The locus is plotted in the complex plane, showing the locations of the closed-loop poles as $K$ increases from zero to infinity. Key steps in Root Locus Analysis include:

  • Identifying Poles and Zeros: Determine the poles (roots of the denominator) and zeros (roots of the numerator) of the open-loop transfer function.
  • Plotting the Locus: Draw the root locus in the complex plane; each branch starts at an open-loop pole and ends at an open-loop zero (or at infinity) as $K$ approaches infinity.
  • Stability Assessment: Analyze the root locus to assess system stability; the closed-loop system is stable when all of its poles lie in the left half-plane.

This method is particularly useful for designing controllers and understanding system behavior under varying conditions.
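A rough numerical sketch of the idea in Python, assuming an illustrative open-loop transfer function $G(s) = \frac{1}{s(s+2)(s+4)}$ that is not taken from the text. The closed-loop poles are the roots of $\mathrm{den}(s) + K\,\mathrm{num}(s) = 0$, evaluated here for a few gains:

```python
import numpy as np

# Illustrative open-loop transfer function G(s) = 1 / (s (s + 2) (s + 4))
num = np.array([1.0])
den = np.array([1.0, 6.0, 8.0, 0.0])   # s^3 + 6 s^2 + 8 s

# Pad the numerator so both coefficient arrays have the same length
padded_num = np.concatenate([np.zeros(len(den) - len(num)), num])

for K in [0.1, 1.0, 10.0, 50.0]:
    closed_loop_poles = np.roots(den + K * padded_num)
    print(f"K = {K:5.1f}  poles = {np.round(closed_loop_poles, 3)}")
```

For this particular example the poles cross into the right half-plane once the gain grows large enough, which is exactly the kind of stability boundary a root locus plot makes visible.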

Perron-Frobenius

The Perron-Frobenius theorem is a fundamental result in linear algebra that applies to positive matrices, i.e. matrices whose entries are all positive. It states that such a matrix has a unique largest eigenvalue, known as the Perron root, which is real and positive, strictly dominates all other eigenvalues in modulus, and has an associated eigenvector with strictly positive components. The theorem extends to irreducible non-negative matrices (those that cannot be transformed into block upper triangular form via simultaneous row and column permutations): the Perron root remains a simple eigenvalue with a strictly positive eigenvector, and it governs the long-term behavior of the system represented by the matrix.

In essence, the Perron-Frobenius theorem provides crucial insights into the stability and convergence of iterative processes, especially in areas such as economics, population dynamics, and Markov processes. Its implications extend to understanding the structure of solutions in various applied fields, making it a powerful tool in both theoretical and practical contexts.
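A minimal power-iteration sketch in Python, assuming an illustrative positive matrix; power iteration converges here precisely because the Perron root dominates all other eigenvalues:

```python
import numpy as np

def perron(matrix, iterations=1000, tol=1e-12):
    """Estimate the Perron root and eigenvector of a positive matrix by power iteration."""
    A = np.asarray(matrix, dtype=float)
    v = np.ones(A.shape[0])
    eigenvalue = 0.0
    for _ in range(iterations):
        w = A @ v
        new_eigenvalue = np.linalg.norm(w)
        v = w / new_eigenvalue          # keep the iterate normalized
        if abs(new_eigenvalue - eigenvalue) < tol:
            break
        eigenvalue = new_eigenvalue
    return eigenvalue, v

rho, vec = perron([[2.0, 1.0], [1.0, 3.0]])
print(rho)   # Perron root, about 3.618
print(vec)   # eigenvector with strictly positive components
```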

Knuth-Morris-Pratt Preprocessing

The Knuth-Morris-Pratt (KMP) algorithm is an efficient method for substring searching that improves upon naive approaches by utilizing preprocessing. The preprocessing phase involves creating a prefix table (also known as the "partial match" table), which helps to skip unnecessary comparisons during the actual search phase. For every position in the pattern, this table records the length of the longest proper prefix of the pattern up to that position that is also a suffix of it.

To construct this table, we initialize an array $\text{lps}$ of the same length as the pattern, where $\text{lps}[i]$ represents the length of the longest proper prefix which is also a suffix of the substring ending at index $i$. The preprocessing runs in $O(m)$ time, where $m$ is the length of the pattern, ensuring that the subsequent search phase operates in linear time, $O(n)$, with respect to the text length $n$. This efficiency makes the KMP algorithm particularly useful for large-scale string matching tasks.
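A sketch of the preprocessing phase in Python; the example pattern is an illustrative assumption:

```python
def build_lps(pattern):
    """lps[i] = length of the longest proper prefix of pattern[:i + 1] that is also its suffix."""
    lps = [0] * len(pattern)
    length = 0                     # length of the previous longest prefix-suffix
    i = 1
    while i < len(pattern):
        if pattern[i] == pattern[length]:
            length += 1
            lps[i] = length
            i += 1
        elif length > 0:
            length = lps[length - 1]   # fall back to a shorter prefix without advancing i
        else:
            lps[i] = 0
            i += 1
    return lps

print(build_lps("ABABCABAB"))   # [0, 0, 1, 2, 0, 1, 2, 3, 4]
```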

Indifference Curve

An indifference curve is a graph showing the different combinations of two goods that provide the same level of utility or satisfaction to a consumer. Each point on the curve indicates a combination of the two goods among which the consumer is equally satisfied and therefore indifferent. The curve's typical convex shape reflects the principle of diminishing marginal rate of substitution: as a consumer acquires more of one good, the amount of the other good they are willing to give up for an additional unit of it decreases.

Indifference curves never cross, as this would imply inconsistent preferences. Furthermore, curves farther from the origin represent higher levels of utility. In mathematical terms, if $x_1$ and $x_2$ are two goods, an indifference curve can be represented as $U(x_1, x_2) = k$, where $k$ is a constant representing the utility level.
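A small sketch in Python, assuming an illustrative Cobb-Douglas utility $U(x_1, x_2) = x_1 x_2$ that is not taken from the text; it lists several bundles lying on the same indifference curve $U(x_1, x_2) = k$:

```python
import numpy as np

k = 10.0                             # fixed utility level
x1 = np.linspace(1.0, 10.0, 5)       # quantities of the first good
x2 = k / x1                          # solve U(x1, x2) = x1 * x2 = k for x2

for a, b in zip(x1, x2):
    print(f"x1 = {a:5.2f}, x2 = {b:5.2f}, U = {a * b:.1f}")   # utility stays at k along the curve
```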