How to use time.sleep() within a function correctly?
Long story short, I am making a text-based RPG in Python and need a little help. I am using the time module to add a delay between certain ASCII banners popping up in the console. However, I don't want to write out time.sleep() (or copy and paste it) every time I use it, so I made a function to shorten what I have to type:
def wait(time):
    time.sleep(time)

wait(1)
Whilst in theory I thought this would work (I'm new to Python, I have much to learn yet), it gives me this error:
time.sleep(time)
AttributeError: 'int' object has no attribute 'sleep'
I was wondering if anyone could help/point me in the right direction on how to go about this problem. Thanks in advance!
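The error is a name shadowing problem: the parameter is named time, so inside the function time refers to the integer argument rather than the time module, and time.sleep(time) fails with the AttributeError above. A minimal fix is to rename the parameter (seconds here is just an illustrative name) and import the module:

import time

def wait(seconds):
    # 'seconds' no longer shadows the time module,
    # so time.sleep resolves to the module function.
    time.sleep(seconds)

wait(1)  # pause for 1 second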
This delays for 2.5 seconds:
import time
time.sleep(2.5)
Here is another example where something is run approximately once a minute:
import time
while True:
    print("This prints once a minute.")
    time.sleep(60)  # Delay for 1 minute (60 seconds).
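Note that this loop drifts slightly: each iteration takes the 60-second sleep plus whatever time the loop body needs. If you want ticks anchored to a fixed schedule, one sketch (interval and next_tick are illustrative names) uses the monotonic clock:

import time

interval = 60
next_tick = time.monotonic() + interval
while True:
    print("This prints once a minute, without cumulative drift.")
    # Sleep until the next scheduled tick instead of a flat 60 seconds,
    # so time spent in the loop body does not accumulate as drift.
    time.sleep(max(0.0, next_tick - time.monotonic()))
    next_tick += interval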
Use sleep() from the time module. It can take a float argument for sub-second resolution.
from time import sleep
sleep(0.1) # Time in seconds
The accuracy of the time.sleep() function depends on the sleep accuracy of your underlying OS. On non-real-time operating systems like stock Windows, the smallest interval you can sleep for is about 10-13 ms. When sleeping above that 10-13 ms minimum, I have seen sleeps accurate to within a few milliseconds of the requested time.
Update: As mentioned in the docs cited below, it's common to do the sleep in a loop that makes sure to go back to sleep if it wakes you up early.
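A minimal sketch of that loop pattern, using time.monotonic() for the deadline so system clock adjustments don't affect it (sleep_until is an illustrative name, not a standard library function):

import time

def sleep_until(deadline):
    # Re-enter sleep until the monotonic clock passes the deadline,
    # in case the OS wakes us up early (e.g. to handle a signal).
    while True:
        remaining = deadline - time.monotonic()
        if remaining <= 0:
            break
        time.sleep(remaining)

sleep_until(time.monotonic() + 2.5)  # block for roughly 2.5 seconds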
I should also mention that if you are running Ubuntu you can try out a pseudo real-time kernel (with the RT_PREEMPT patch set) by installing the rt kernel package (at least in Ubuntu 10.04 LTS).
Non-real-time Linux kernels have minimum sleep intervals much closer to 1ms than 10ms, but it varies in a non-deterministic manner.
People are quite right about the differences between operating systems and kernels, but I do not see any such granularity on Ubuntu, while I see a 1 ms granularity on Windows 7. That suggests a different implementation of time.sleep(), not just a different tick rate. Closer inspection suggests a 1 μs granularity on Ubuntu, by the way, but that is due to the time.time() function I use for measuring the accuracy.
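If you want to check the granularity on your own machine, a rough sketch like this (measure_sleep is an illustrative helper) times repeated short sleeps with time.perf_counter(), which has much finer resolution than time.time():

import time

def measure_sleep(requested=0.001, trials=200):
    # Time many short sleeps with the high-resolution performance
    # counter to estimate the real minimum sleep interval here.
    elapsed = []
    for _ in range(trials):
        start = time.perf_counter()
        time.sleep(requested)
        elapsed.append(time.perf_counter() - start)
    return min(elapsed), sum(elapsed) / trials

shortest, mean = measure_sleep()
print(f"requested 1.000 ms, shortest {shortest * 1000:.3f} ms, mean {mean * 1000:.3f} ms")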
