Which factor affects the brightness of a bulb: the current in the circuit, or the voltage?
Both. It's the power that ultimately causes the filament to get hot and emit visible black body radiation.
Power is voltage times current, so both matter.
However, you can only control one degree of freedom. The bulb dictates the other. This single degree of freedom can be expressed various ways. Two of them are fixing the current and fixing the voltage. Once you fix one of these, the resistance of the bulb implicitly fixes the other.
Note that the resistance of a bulb varies considerably with temperature. It is much higher when the bulb is emitting light than when it is sitting cold and unpowered. However, that still doesn't let you fix both independently. It only means that the relationship between voltage and current changes with the set point.
(Answer from Olin Lathrop on Stack Exchange)
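The "one degree of freedom" idea can be sketched numerically. The values below are hypothetical (a 120 V / 60 W bulb with its resistance held at its hot value); fixing either voltage or current lets the resistance determine everything else:

```python
# One degree of freedom: fix either voltage or current; the bulb's
# resistance (assumed constant at its hot value here) fixes the other.
R_hot = 240.0  # ohms, hypothetical hot resistance of a 120 V / 60 W bulb

# Fix the voltage: current and power follow.
v = 120.0
i = v / R_hot          # Ohm's law
p = v * i              # power dissipated in the filament
print(f"Fixed V={v} V -> I={i:.3f} A, P={p:.1f} W")

# Fix the current instead: voltage and power follow.
i = 0.5
v = i * R_hot
p = v * i
print(f"Fixed I={i} A -> V={v:.1f} V, P={p:.1f} W")
```

Either way you end up at the same operating point; you simply cannot choose voltage and current independently.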
If you think of the lighting element, which may be a length of tungsten or the like, as a fixed resistor of R ohms, the current in the resistor is determined by Ohm's law:

i = v/R

The higher the voltage v, the higher the current i. If the brightness is caused by a flow of electrons through the filament, a higher voltage will, all things being equal, drive more electrons to flow through the filament.
So the brightness is a function of both current and voltage, and can be said to depend on both. That is, we can write B = f(i) or B = g(v).

We might be tempted to say that the intensity I of the light is proportional to the current i, but we would have to remember that i is itself a function of v, and put I ∝ i = v/R.
This can get arbitrarily complicated, since intensity of the light may not be proportional to current in a simple way, and the precise behavior of electrons in a circuit depends on the exact conditions and the nature of the components. For example, as comments below note, the resistance may not be constant, but may be a function of temperature.
An accessible article on how resistance varies in a light bulb is given here. A plot of I vs. V shows that the resistance R is not in general constant. To characterize the circuit in terms of V and I, you might have to solve the circuit numerically or take some measurements to get a quantitative idea.
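Solving for such an operating point can be sketched with a toy model. The resistance law below, R(P) = R_cold · (1 + α·P), is a made-up linearization for illustration, not a physical tungsten model; the point is only that when R depends on the dissipated power, the operating point must be found iteratively:

```python
# Toy model: bulb resistance rises with dissipated power (filament heating).
# R(P) = R_cold * (1 + alpha * P) is a hypothetical relation chosen so the
# numbers work out nicely; real tungsten behaves differently in detail.
R_cold = 24.0   # ohms, cold resistance (assumed)
alpha = 0.15    # 1/W, made-up heating coefficient
V = 120.0       # fixed supply voltage

# Damped fixed-point iteration: guess P, update R, recompute P = V^2 / R.
P = V**2 / R_cold
for _ in range(200):
    R = R_cold * (1 + alpha * P)
    P_new = V**2 / R
    if abs(P_new - P) < 1e-9:
        P = P_new
        break
    P = 0.5 * (P + P_new)  # damping keeps the iteration from oscillating
R = R_cold * (1 + alpha * P)

print(f"Operating point: R={R:.1f} ohm, I={V/R:.3f} A, P={P:.1f} W")
```

With these particular numbers the bulb settles at ten times its cold resistance, which is the qualitative behavior the answer describes.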
I read about this online and they say it's the power that affects brightness, so it's actually both current and voltage. However, when I connected a bunch of light bulbs together in parallel, they all had the same brightness regardless of how many bulbs I added (even though the current is being split across them, so they should be getting less power), and when I connected them in series, each bulb got dimmer with every bulb I added.
I have done some searches, but it seems the graphs always relate voltage to forward current, or forward current to relative luminous flux. Also, if this has something to do with basic concepts, please do point it out for me. Thanks!
LEDs are a very different beast compared to incandescent light bulbs. LEDs belong to a class of devices known as non-linear devices. These don't follow Ohm's Law in the classic sense (though Ohm's Law is still used in conjunction with them).
An LED is (obviously) a form of diode. It has a forward voltage which is the voltage at which the diode starts to conduct. As the voltage increases so does how well the diode conducts, but it does that in a non-linear fashion.
With an LED it's the amount of current flowing through it that determines how bright it is. Increasing the voltage increases the current, yes, but the region where that happens without the current getting too high is very small. On a typical red LED's I–V curve it may be the tiny stretch around 1.5 V, and by the time you get to 2 V the current is off the scale and the LED burns out.
Putting LEDs in series does sum the forward voltages, so you have to provide a higher voltage for conduction to start, but the controllable region is still just as tiny.
So we control the current instead of the voltage, and take the forward voltage as a fixed value. By either including a resistor in the circuit to fill the gap between the supply voltage and the forward voltage, limiting the current in the process, or by using a constant current supply, we can set the current that we want to flow through the LED and thus set the brightness. By increasing the current, but not increasing the voltage (or only a negligible amount, and purely incidentally), we increase the brightness.
The formula for calculating the resistance to use for a specific current is:

R = (V_S − V_F) / I_F

where V_S is the supply voltage, V_F is the LED forward voltage, and I_F is the desired LED forward current.
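Applying that formula is a one-liner. The component values below are typical textbook numbers (5 V supply, a 2 V red LED at 20 mA), not from any specific datasheet:

```python
# Series resistor sizing for an LED: R = (V_S - V_F) / I_F.
V_S = 5.0    # supply voltage (assumed)
V_F = 2.0    # LED forward voltage (assumed, e.g. a red LED)
I_F = 0.020  # desired forward current, 20 mA

R = (V_S - V_F) / I_F
P_R = (V_S - V_F) * I_F   # power dissipated in the resistor

print(f"R = {R:.0f} ohm, resistor dissipates {P_R * 1000:.0f} mW")
```

In practice you would round to the nearest standard resistor value (150 Ω happens to be one already).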
No, an LED by itself (no resistors or other electronics) behaves quite differently from a light bulb.
Have a look at this datasheet of a random LED.
Scroll down to the page with many graphs. The third graph shows the relative intensity (light) versus current through the LED:
(Source: 334-15/T1C1-4WYA datasheet)
You'll notice that this curve is somewhat linear, meaning twice the current would give you roughly twice as much light.
What have we learned: an LED's brightness is roughly proportional to the current flowing through it.
But what current do you get for a certain voltage?
Look at graph 2:
(Source: 334-15/T1C1-4WYA datasheet)
Forward current vs. forward voltage: notice how the current increases rapidly for a voltage above 3 V. Only 0.5 V more gives 4× the current! This curve also varies between LEDs and over temperature.
That is why it is better to feed an LED a current instead of a voltage. If you feed an LED a voltage, the current is not very predictable, so neither is the brightness. The power delivered to the LED will also vary, since power is voltage × current.
It is better to keep an LED at a constant current, and that is why series resistors are needed: they limit the current to the intended value. Not exactly, but close enough for most purposes.
With the series resistor in place, an LED (+ resistor) behaves somewhat more like a light bulb, in the sense that the change in brightness is more nearly proportional to the voltage you apply.
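Both effects can be sketched with an idealized exponential diode law. All parameter values here are assumptions for illustration, not taken from the datasheet quoted above:

```python
import math

# Idealized diode law: I = I_s * (exp(V / (n*Vt)) - 1).
# I_s, n, and Vt are illustrative assumptions, not measured values.
I_s = 1e-12         # saturation current (assumed)
n, Vt = 2.0, 0.025  # ideality factor and thermal voltage (assumed)

def led_current(v):
    """Forward current of the bare LED at forward voltage v."""
    return I_s * math.expm1(v / (n * Vt))

# A small voltage step multiplies the bare-LED current enormously:
print(led_current(1.6) / led_current(1.5))  # ~ e^(0.1/0.05) = e^2 ≈ 7.4

# With a 150-ohm series resistor, find V_led by bisection so that the
# resistor line (V_supply - V_led)/R meets the diode curve; the current
# now grows gently with supply voltage instead of exploding.
def current_with_resistor(v_supply, r=150.0):
    lo, hi = 0.0, v_supply
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if led_current(mid) > (v_supply - mid) / r:
            hi = mid
        else:
            lo = mid
    return (v_supply - lo) / r

# Raising the supply from 4 V to 5 V changes the current only modestly.
print(current_with_resistor(5.0) / current_with_resistor(4.0))
```

The bare LED's current jumps by roughly e² for a 0.1 V step, while the resistor-fed LED's current scales roughly with (V_supply − V_F), which is the "behaves more like a light bulb" effect described above.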