Calculate the time out of sample points with set frequency

In summary, the data is coming in clumps and the software is incorrectly assigning timestamps to the data. You are trying to determine the delay in acquiring the data but don't know how to do it.
  • #1
Kurzma
Hi all,

at the moment I am doing my Master Thesis and have the following problem.

I am trying to measure data and assign it a proper timestamp. My problem is that the data is coming in bursts, and the timestamps I assign in software are wrong.
The controller I am using to monitor the data should run at 100 Hz, but it doesn't. I checked the data while applying a sinusoidal 100 Hz signal: there were constant peaks every 10 data points and a slow sinusoidal drift, contrary to the expected horizontal line.

Now I am trying to determine the delay in acquiring the data (the peaks in the point data). So we have the equation:
f(t) = A sin(ωt) + C
where
f ≈ 100 Hz
ω = 2πf
A = amplitude (from the acquired data)
C = offset (could be calculated)

and then somehow just solve for t? But it doesn't feel right...
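For what it's worth, solving that equation for t directly runs into the fact that arcsin is multi-valued, which may be part of why it doesn't feel right. A minimal sketch (the numeric values are illustrative, not taken from the attached data):

```python
import math

# Assumed signal parameters (illustrative values, not from the attachment)
A = 1.0        # amplitude
C = 0.5        # DC offset
f = 100.0      # signal frequency in Hz
w = 2 * math.pi * f

def candidate_time(y, A, C, w):
    """Invert y = A*sin(w*t) + C for one candidate t.

    math.asin only returns values in [-pi/2, pi/2], so the true t is
    determined only up to t + k/f plus the mirrored branch
    (pi - asin(...))/w + k/f. That ambiguity is why a single sample
    cannot pin down the acquisition delay on its own.
    """
    return math.asin((y - C) / A) / w

t = 0.0013                      # some true sample time
y = A * math.sin(w * t) + C     # the value we would measure at that time
t_est = candidate_time(y, A, C, w)
print(abs(t_est - t) < 1e-9)    # True here, because asin picks the right branch
```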

Also, I came up with another approach:
f(t_x) = x = A sin(ωt + φ)
f(t_y) = y = A sin(ωt + d + φ)
x − y = A sin(ωt + φ) − A sin(ωt + d + φ)
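For reference, the sum-to-product identity that collapses that difference (presumably the identity the trig searching was after) gives:

```latex
% Sum-to-product identity:
%   \sin\alpha - \sin\beta = 2\cos\!\left(\tfrac{\alpha+\beta}{2}\right)\sin\!\left(\tfrac{\alpha-\beta}{2}\right)
% With \alpha = \omega t + \phi and \beta = \omega t + d + \phi:
x - y = A\sin(\omega t + \phi) - A\sin(\omega t + d + \phi)
      = -2A\cos\!\left(\omega t + \phi + \tfrac{d}{2}\right)\sin\!\left(\tfrac{d}{2}\right)
```

so the difference is itself a sinusoid whose amplitude, 2A sin(d/2), depends only on the delay d.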

I would really appreciate any help with this. I am a real doofus when it comes to calculus, especially with harmonic oscillators. I spent the whole day browsing trigonometric identities and didn't really get far.

I hope I explained everything properly. If anything is unclear, please feel free to ask.
I also uploaded a sample of my data.

Best Regards
 

Attachments

  • testing_100Hz.xlsx
    2.3 MB
  • #2
I'm making some guesses here, but this seems to be what you are describing:
You are collecting data from some source - apparently a signal that can be replaced with a sine wave generator.
The data you are receiving at the software end is not only arriving in clumps, but is apparently being collected in clumps, so the sine wave isn't looking as smooth as it should.

My guess is that the data collection rate is being controlled through software timing - rather than a regular series of hardware-generated pulses.

For example, if you are collecting data through an analog-to-digital converter, the sampling signal to the ADC should not be driven directly by the software. It is okay for the software to have some executive control over the signal (start, stop, frequency/period selection), but it should not be generating the timing pulses themselves.
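If the hardware really does sample at a fixed rate and only the delivery to the software is bursty, one workaround (a sketch under that assumption; the function name is illustrative) is to ignore the software arrival times entirely and reassign timestamps on a uniform grid:

```python
def uniform_timestamps(n_samples, fs, t0=0.0):
    """Assign timestamps assuming a constant hardware sample rate fs (Hz).

    Arrival times at the software end are bursty and unreliable, so we
    anchor at t0 (e.g. the arrival time of the first sample) and space
    every subsequent sample exactly 1/fs apart.
    """
    return [t0 + i / fs for i in range(n_samples)]

# 10 samples arriving in one burst still get 10 ms spacing at 100 Hz
ts = uniform_timestamps(10, fs=100.0)
print(ts[1] - ts[0])  # 0.01
```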
 
  • #3
Let me add something: the experiment you're doing with the sine wave is good - assuming the period of the sine wave is slower than the sampling period, but not a whole lot slower; slower by a factor of 3 to 12 would be fine. I would collect 16384 samples, apply an FFT (using, for example, MATLAB), and then check for a very strong spike.
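As a sketch of that check in Python/NumPy (rather than MATLAB; the signal here is simulated, not the attached data):

```python
import numpy as np

fs = 100.0                      # nominal sample rate, Hz
n = 16384                       # number of samples, as suggested above
f_sig = 10.0                    # test-tone frequency, about fs/10
t = np.arange(n) / fs           # ideal, evenly spaced sample times
x = np.sin(2 * np.pi * f_sig * t)

spectrum = np.abs(np.fft.rfft(x))
freqs = np.fft.rfftfreq(n, d=1.0 / fs)
peak = freqs[np.argmax(spectrum)]
print(peak)  # close to 10.0 Hz: one strong spike if sampling is regular
```

If the controller's timing is irregular, the spike smears out and extra components appear - in this case, one would expect sidebands around the tone from the every-10-samples bursts.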

But I am also presuming that you have already concluded that the data is not being collected at regular intervals. That should be all you need to know. Determining the delay might provide a useful diagnostic for finding a problem in a system that should work.

You haven't told us what data collection system you are using or why you think that system should work. So the first place I would look is at the basic design of your data collection system.
 
  • #4
Kurzma said:
The controller I am using to monitor the data should run at 100 Hz, but it doesn't. I checked the data while applying a sinusoidal 100 Hz signal: there were constant peaks every 10 data points and a slow sinusoidal drift, contrary to the expected horizontal line.
I looked at your data and reread your post.
I am not sure what you want. Have you concluded that the controller is not collecting the data correctly, but you want to continue using that controller anyway?
Normally you would fix your data collection system and recollect the data. Perhaps you cannot recollect the data?
 

