How can I estimate the running time of an algorithm from its time complexity and the value ranges of its variables?

Are there formulas that compute an algorithm's running time from the hardware configuration together with the information above?

Or is there a lookup table (based on the hardware configuration of a typical judging machine) that could be used instead?

For example: $$$O(\log n)$$$ can pass when $$$n \le 10^{13}$$$, $$$O(n)$$$ can pass when $$$n \le$$$ ___, and so on. Is such an approach workable for the questions raised above?

If so, please share the specific method; if not, please explain why not.

Thank you!

**UPD:** There is another idea, though I don't know whether it is feasible: measure how long the hardware takes to execute one statement, then multiply that by the algorithm's operation count to get the final estimate. This should work as long as the average execution time of a statement is known.

**UPD2:** I now know that a typical judging machine performs roughly $$$10^7 \sim 10^8$$$ operations per second. That settles the problem. Thank you!