Even in quantum computers, many calculations are done using traditional circuits. What usually happens is that most of the work runs on traditional microprocessors, until some hard algorithm needs to run - things like prime factorisation or brute-force search. These are the algorithms that a quantum computer can do much faster.
At this point the input data - which is in bits - is converted into qubits. The computation is then done in the quantum system. Finally, the answer from the quantum system is measured, and that output is in bits again.
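This bits-to-qubits-to-bits pipeline can be sketched in plain Python. This is a toy simulation, not how a real quantum computer is programmed: a single qubit is modelled as a pair of complex amplitudes, and the function names (`encode`, `apply_gate`, `probabilities`) are made up for illustration.

```python
# A minimal sketch of the classical -> quantum -> classical pipeline.
# A single qubit is simulated as a pair of complex amplitudes (alpha, beta);
# the probability of measuring 0 is |alpha|^2 and of measuring 1 is |beta|^2.

def encode(bit):
    """Convert a classical bit into a qubit state vector."""
    return (1 + 0j, 0j) if bit == 0 else (0j, 1 + 0j)

def apply_gate(gate, state):
    """Apply a 2x2 quantum gate (a matrix) to the state vector."""
    (a, b), (c, d) = gate
    alpha, beta = state
    return (a * alpha + b * beta, c * alpha + d * beta)

def probabilities(state):
    """Born rule: measurement outcome probabilities from the amplitudes."""
    alpha, beta = state
    return abs(alpha) ** 2, abs(beta) ** 2

# The quantum NOT gate (Pauli-X) swaps the amplitudes of 0 and 1,
# mirroring a classical NOT.
X = ((0j, 1 + 0j),
     (1 + 0j, 0j))

state = encode(0)               # input bit becomes a qubit ...
state = apply_gate(X, state)    # ... the quantum system computes ...
p0, p1 = probabilities(state)   # ... and measurement gives bits back
print(round(p0), round(p1))     # -> 0 1
```

Here the "quantum" step is just a NOT gate, so the output is deterministic; the interesting cases, where amplitudes are spread over both outcomes, come up below.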
All this is interesting, but how does it work in practice?
Normal computer bits can be represented physically in various forms. In the old days there were punch cards, where a hole in a paper card represented whether the bit value was zero or one. Today bits are represented as electromagnetic charges. In a similar way, qubits can be represented physically in many ways. For example, electrons have a property called 'spin'. The spin can be either 'up' or 'down'. We could take an electron and set the spin to 'up' and say that this represents a qubit in state |0⟩, while an electron with spin 'down' represents a qubit in state |1⟩. This is not the only representation: you could also use properties of neutrinos, photons (light particles), phonons (sound particles) and so on.
Traditional computing uses logic gates to manipulate bits, and in a similar vein, quantum computing has its own 'quantum gates' that manipulate qubits. In the case of electrons you would use electromagnetic fields to change the state, while a system built on photons would use mirrors and beam splitters. The laws of quantum mechanics determine what the output will be. Since quantum mechanics works in a probabilistic space, the gates change the probability amplitudes - the 'weights' - of the qubit's possible states.
At the end of the computation, we measure the spin of the electron and get an answer, 'up' or 'down'. That represents a zero or a one, and further processing is done by traditional computing again.
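The measurement step can be sketched as well. In this toy model the amplitudes set the outcome probabilities, and each run of the computation yields one classical bit ('up' → 0, 'down' → 1 in the electron example above); the function name `measure` is made up for illustration.

```python
import random

# A sketch of measurement: the amplitudes give outcome probabilities,
# and the result of each measurement is a single classical bit.

def measure(state):
    """Sample 0 or 1 according to the Born-rule probabilities."""
    alpha, beta = state
    p0 = abs(alpha) ** 2 / (abs(alpha) ** 2 + abs(beta) ** 2)
    return 0 if random.random() < p0 else 1

# A qubit in equal superposition: repeated runs of the whole
# computation give 0 about half the time and 1 the other half.
s = 2 ** -0.5
counts = [measure((s, s)) for _ in range(10_000)]
print(sum(counts) / len(counts))  # roughly 0.5
```

This is also why quantum algorithms are often run many times: a single measurement gives one probabilistic answer, and the classical computer that takes over afterwards aggregates or verifies the results.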