Recent content by Inferior89

  1.

    Bayesian inference of Poisson likelihood and exponential prior.

Hey, I have some problems understanding my statistics homework. I am given a data set giving the number of calls arriving at different switchboards in three hours, as well as the total phone call duration in minutes for each switchboard. Something like i y_i t_i -------------- 1...
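With a Poisson likelihood and an exponential prior on the rate, the posterior is a Gamma distribution, so the update can be done in closed form. A minimal sketch of that conjugate update, with made-up data and an assumed prior rate b (the thread's actual numbers are not shown):

```python
# Model (assumed setup): y_i ~ Poisson(lam * t_i),
# prior lam ~ Exponential(b), i.e. Gamma(shape=1, rate=b).
# Posterior: lam | data ~ Gamma(1 + sum(y_i), b + sum(t_i)).
y = [12, 7, 9]             # calls per switchboard (made-up data)
t = [180.0, 180.0, 180.0]  # minutes observed per switchboard (made-up)
b = 0.5                    # prior rate parameter (assumed)

post_shape = 1 + sum(y)
post_rate = b + sum(t)
posterior_mean = post_shape / post_rate
print(post_shape, post_rate, posterior_mean)
```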
  2.

A really fast question about a partial derivative

    Assuming that x is measured in meters.
  3.

Average force exerted on pedals tangent to their circular path on a bike

The work that needs to be done will be mgh, where m is the mass of the bike + cyclist, g is the gravitational acceleration, and h is the vertical height of the hill. The total distance the wheels will need to travel will be 180/sin(8.2 degrees). Now you can calculate how many times the wheel will...
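A numerical sketch of that reasoning; the 8.2 degree grade and the 180/sin(8.2 deg) distance come from the thread (which implies a vertical rise of 180 m), while the mass and wheel circumference are assumed values:

```python
import math

m = 90.0                      # kg, bike + cyclist (assumed)
g = 9.81                      # m/s^2
theta = math.radians(8.2)
h = 180.0                     # m, vertical rise implied by 180/sin(8.2 deg)
work = m * g * h              # J, work needed against gravity
slope_length = h / math.sin(theta)  # m, distance the wheels roll
circumference = 2.1           # m, wheel circumference (assumed)
revolutions = slope_length / circumference
print(work, slope_length, revolutions)
```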
  4.

    Understanding Derivative: Solving for f'(x) in Different Functions

f(x) = -3(2x^2 - 5x + 1). f'(x) = 12x + 15 is wrong; f'(x) = -12x + 15 is correct.
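The sign fix is easy to sanity-check numerically with a central difference (stdlib only):

```python
def f(x):
    return -3 * (2 * x**2 - 5 * x + 1)

def fprime(x):
    return -12 * x + 15  # corrected derivative from the post

# central-difference approximation of f'(x0)
x0, h = 2.0, 1e-6
numeric = (f(x0 + h) - f(x0 - h)) / (2 * h)
print(numeric, fprime(x0))
```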
  5.

A really fast question about a partial derivative

    Wait.. Is your prof saying that d/dx alpha*x^2 = 0?
  6.

    Integrating (2/a)[200sin(3πx)sin(nπx/a)] from 0 to a

    Because your delimiters are wrong. Don't use "[" especially when you don't have a closing "]". http://www.wolframalpha.com/input/?i=integrate+(2/a)(200sin(3*Pi*x)sin(n*pi*x/a))++dx+from+0+to+a
  7.

    Integrating (2/a)[200sin(3πx)sin(nπx/a)] from 0 to a

    If you only want to find an answer you can use www.wolframalpha.com and tell it: "integrate f(x) dx from 0 to a" where f(x) is your function. However, it is always good to know how to do stuff by hand. I think two integrations by parts should work.
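The answer can also be checked numerically with a hand-rolled trapezoid rule (stdlib only). Taking a = 1 as an assumed value, sin(3*pi*x) equals sin(3*pi*x/a), so the sines are orthogonal on [0, a] and the integral is 200 for n = 3 and 0 for other integers n:

```python
import math

def integrand(x, n, a):
    return (2.0 / a) * 200.0 * math.sin(3 * math.pi * x) * math.sin(n * math.pi * x / a)

def trapezoid(f, lo, hi, steps=20000):
    # composite trapezoid rule
    h = (hi - lo) / steps
    total = 0.5 * (f(lo) + f(hi))
    for i in range(1, steps):
        total += f(lo + i * h)
    return total * h

a = 1.0  # assumed
print(trapezoid(lambda x: integrand(x, 3, a), 0.0, a))  # close to 200
print(trapezoid(lambda x: integrand(x, 2, a), 0.0, a))  # close to 0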
  8.

    Proving the Dot Product of Orthonormal Vectors in a Matrix

    Ah shiet. That is right.. I should have read more carefully. To OP, What I have said so far works for orthonormal matrices but that is not what you have lol. Sorry.
  9.

    Proving the Dot Product of Orthonormal Vectors in a Matrix

You did one step too far. (Ux) . (Uy) = (x^T U^T)(Uy) = x^T (U^T U) y. Now use what you know about U^T U.
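A small numerical illustration of that identity: when U has orthonormal columns, U^T U = I, so (Ux) . (Uy) = x^T (U^T U) y = x . y. A 2x2 rotation matrix serves as the example U (assumed, since the thread's matrix isn't shown):

```python
import math

t = 0.7
U = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]  # rotation matrix: U^T U = I

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

x = [1.0, 2.0]
y = [-3.0, 0.5]
print(dot(matvec(U, x), matvec(U, y)), dot(x, y))  # equal
```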
  10.

    Proving the Dot Product of Orthonormal Vectors in a Matrix

Yes, use this together with (Ux)^T = x^T U^T, the fact that matrix multiplication is associative, and that x . y = x^T y.
  11.

    How do I prove that a sequence is open?

So basically you can get as close to x as you want, as long as you choose N big enough. Do you see that this, together with the fact that A is open, will give you the required result?
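Spelled out, the argument hinted at here runs roughly like this (a sketch, with the standard definitions assumed):

```latex
Since $A$ is open and $x \in A$, there is an $\varepsilon > 0$ with
$B(x, \varepsilon) \subseteq A$. Since $x_n \to x$, there is an $N$ such that
$|x_n - x| < \varepsilon$ for all $n \ge N$. Hence $x_n \in B(x, \varepsilon)
\subseteq A$ for all $n \ge N$.
```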
  12.

    Proving the Dot Product of Orthonormal Vectors in a Matrix

    Do you know anything about inverses of orthonormal matrices?
  13.

    How do I prove that a sequence is open?

The first sentence of your post should be "Let A be an open set," I guess. A: (--------x-) And then you have a sequence of numbers that converges to x. What is the definition of a sequence converging to a number?
  14.

    How Do You Isolate dy/dx in Implicit Differentiation?

    I made a mistake when getting the 2xy term. It should have been 4xy. Then they are the same. I fixed my stuff above. (You might need to refresh the page to see the changes).
  15.

    How Do You Isolate dy/dx in Implicit Differentiation?

In the second line you forgot the 2xy term, but then it reappeared on the third line on the right-hand side. The answer at the end looks sort of right.
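The thread's equation isn't shown, so here is a hedged sketch of isolating dy/dx on a hypothetical curve, F(x, y) = x^2*y + x*y^2 - 6 = 0, which passes through (1, 2). Differentiating term by term and collecting the y' terms gives y' = -(2xy + y^2)/(x^2 + 2xy), which the code checks numerically:

```python
def F(x, y):
    return x**2 * y + x * y**2 - 6

def dydx(x, y):
    # from (2xy + x^2 y') + (y^2 + 2xy y') = 0, isolate y':
    return -(2 * x * y + y**2) / (x**2 + 2 * x * y)

def y_of_x(x, y0=2.0):
    # solve F(x, y) = 0 for y near y0 by Newton's method; dF/dy = x^2 + 2xy
    y = y0
    for _ in range(50):
        y -= F(x, y) / (x**2 + 2 * x * y)
    return y

h = 1e-6
numeric = (y_of_x(1 + h) - y_of_x(1 - h)) / (2 * h)
print(numeric, dydx(1, 2))
```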