You would have to write a dedicated tool that splits the query into multiple requests and writes the results to a file. You could use the API:Categorymembers API, for instance, listing 500 entries per request (the limit) and passing the continue parameter from one request to the next. Or install a database dump locally and export the entries from there. A million is a lot; it is not the type of request that the Wikimedia API is optimised for.
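As a rough illustration of the continuation loop described above, here is a minimal Python sketch. It assumes the standard API:Categorymembers parameters (cmtitle, cmlimit, cmcontinue) against a w/api.php endpoint; the optional fetch argument and the endpoint URL are illustrative choices, not a fixed recipe.

```python
import json
from urllib import request, parse

API = "https://en.wikipedia.org/w/api.php"  # assumed endpoint

def iter_category_members(category, fetch=None):
    """Yield member titles, one API request per chunk of up to 500.

    `fetch` is an optional hook (illustrative) that takes the params
    dict and returns the decoded JSON response; when omitted, a plain
    urllib request is made against the API endpoint.
    """
    params = {
        "action": "query",
        "list": "categorymembers",
        "cmtitle": category,
        "cmlimit": "500",   # maximum for normal users
        "format": "json",
    }
    while True:
        if fetch is not None:
            data = fetch(params)
        else:
            url = API + "?" + parse.urlencode(params)
            with request.urlopen(url) as resp:
                data = json.load(resp)
        for member in data["query"]["categorymembers"]:
            yield member["title"]
        cont = data.get("continue")
        if cont is None:
            break
        params.update(cont)  # carry cmcontinue into the next request
```

A driver script would then write the yielded titles to a file line by line, so memory use stays flat no matter how many entries the category holds.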
Also, depending on what you are trying to achieve, there may be better ways to reach that goal than listing a million-plus entries in one go.