DEV Gargaj wrote:

Perpetuum.exe -adapter N will be in the next version, as well as storing the window positions.

Awesome work! I am starting to get VERY impressed by the speed at which the DEVs of this game operate and respond to user feedback.

Count on a further extension of my accounts real soon!

DEV Zoom wrote:

Confirmed, will be fixed asap.
(#3535)

Thank you very much!

I just found this out, bear with me!

Not long ago, new functionality was introduced to manage your own geo-scanner result folders. These coexist alongside the ones your geo-scanner creates automatically when you upload results.

When you want to remove results you no longer use, you can either delete individual scans manually or choose to delete the whole folder.

Now, here it comes....

When you delete a whole folder that was created for you automatically, say Titan ore... you delete ALL Titan ore scans, no matter which folder you placed those results in!!!

Here I was thinking: hey, neat option, this will save me from deleting each individual scan after a mining operation! Now I am shocked to discover it deleted all my scan results of said ore, even those I had set aside in different folders explicitly for later visits!

Please address this issue as soon as possible. I think it is the result of some older code not being updated when the option to create your own folders was added.
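
To illustrate what I suspect is happening (purely a guess on my part, I obviously have no access to the game's code, and every name below is made up), here is a minimal sketch: if the "delete folder" action is still keyed on the ore type instead of the folder itself, it will wipe matching scans from every folder.

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical model of a geo-scanner result; not Perpetuum's actual code.
struct ScanResult {
    std::string oreType;  // e.g. "Titan ore"
    std::string folder;   // folder the player filed the result under
};

// Suspected old behaviour: deletes by ore type, ignoring which folder was picked.
void deleteFolderOld(std::vector<ScanResult>& all, const std::string& oreType) {
    all.erase(std::remove_if(all.begin(), all.end(),
              [&](const ScanResult& r) { return r.oreType == oreType; }),
              all.end());
}

// Expected behaviour: delete only the scans filed in the chosen folder.
void deleteFolderNew(std::vector<ScanResult>& all, const std::string& folder) {
    all.erase(std::remove_if(all.begin(), all.end(),
              [&](const ScanResult& r) { return r.folder == folder; }),
              all.end());
}

int main() {
    std::vector<ScanResult> results = {
        {"Titan ore", "Titan ore"},     // auto-created folder
        {"Titan ore", "Later visits"},  // manually filed for a later trip
    };
    deleteFolderOld(results, "Titan ore");           // wipes BOTH entries
    std::cout << results.size() << " scans left\n";  // prints 0 instead of 1
}
```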

Thanks in advance!

Tailn wrote:

I really don't understand the problem here; the card that the window is running on (i.e. the monitor) is the card that does the rendering. To change the card that the game is being rendered by, simply drag the window to another monitor.

This is not true, at least not on ATI cards as far as I can tell.  As soon as you move one client window to another monitor, frame rates go from near 60 to around 8 to 12.

Tailn wrote:

Alternatively, have your client's icon on the desktop of the monitor that you would like the client to start on. Of course, to do this you need to be running in windowed mode (you are doing that, are you not?)

Yes, I am running in windowed mode; otherwise only one display would be visible, and it would still run from the primary card.

Tailn wrote:

Also, your little known fact is definitely not little or unknown; that is the absolute basics of multi-GPU rendering as it is currently implemented. I also take it that you are not running 6xxx series cards then, as if you were you could run all 3 monitors off the one card (DisplayPort).

You would be surprised, and in fact I do run a dual 6970 setup. And contrary to your suggestion, one card cannot connect all three displays without additional converters, since my monitors are true DisplayPort and not Mini DisplayPort as seems to be the rage these days. And the secondary DVI on these cards is not dual-link, so my 30" displays running at 2560x1200 cannot be handled by that connector either. So without using converters, this arrangement is the only one possible!

Besides, connecting everything to one card without special support from the game software would not use the secondary graphics card at all and would therefore be pointless.

Tailn wrote:

All adding a render device option ingame will do is tell Windows which monitors to exclude; you will also need to specify an attached monitor as well as a card for that to work as you would like.
It is impossible without coding to use CUDA/OpenCL to offload processing and then send results back to the primary card through the bus (completely un-optimal and full of bottlenecks), or to use CrossFireX.

As explained, CrossFireX cannot work given all these constraints; manual assignment of resources is needed. And my proposed solution does in fact fix the issue, as it also works in EVE, where I have the exact same situation.

That game does have the option to select one of three adapters (those are the virtualized ones for each display I have connected), and when I assign them properly, it runs very smoothly. When I assign them in a way that there is no 1:1 relationship between the card the client is being rendered on and the display it is being shown on, performance drops to 3-6 fps, for obvious reasons!

In short, these sorts of games are better tuned manually; it's not an FPS where you only run one client and enlist a second card to beef up the FPS to unreal levels. The difference is night and day, and also consider the benefit of not having 4 clients compete for video memory for textures and the like. It is much better partitioned across two cards, without special assumption-riddled magic.

The purpose of this request is to make use of multiple graphics cards when available. At this moment there are situations where multiple accounts cause the frame rate to drop to abysmal levels regardless of graphics settings. This is NOT because one card has to do all the render work, but because the card that the display is connected to is NOT the card doing the rendering. All render output for display has to be transferred from one card to the other, and this incurs a horrific efficiency penalty.
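
To make that mismatch concrete: assuming the client uses Direct3D 9 (I do not know what Perpetuum uses internally, and the function below is my own illustration, not engine code), the engine can check whether the monitor the window sits on is actually the one driven by the adapter it renders on. When the two differ, every frame has to travel across the bus.

```cpp
#include <windows.h>
#include <d3d9.h>
#pragma comment(lib, "d3d9.lib")

// Returns true when the game window sits on a monitor that is driven by the
// adapter we render on. If this is false, every frame is copied between cards.
bool windowMatchesAdapter(IDirect3D9* d3d, UINT adapter, HWND gameWindow)
{
    HMONITOR adapterMonitor = d3d->GetAdapterMonitor(adapter);              // monitor wired to the rendering card
    HMONITOR windowMonitor  = MonitorFromWindow(gameWindow,
                                                MONITOR_DEFAULTTONEAREST);  // monitor the window is shown on
    return adapterMonitor == windowMonitor;
}
```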

My situation is as follows (let's take this as a real-world example):

3 monitors, two 2GB ATI cards (top of the line)

My primary card controls a 30" display and a 24" display.
My secondary card controls another 30" display.

This arrangement is needed because not all of these displays can be hooked up to one card, as CrossFireX requires. That, and the fact that most games you will play with multiple clients concurrently do not support CrossFireX anyway, makes this the optimal arrangement.

A little known fact is that when you do enable CrossFireX in this setup, you end up with only two displays, as all displays need to be connected to one card, which is not possible in this case.

The solution is quite elegant and simple: add a drop-down or a command-line switch to select the adapter to render to. Each connected display causes a "virtual" video card to be created that is visible to the client software. And as long as the output window is rendered on the same card that its display is connected to, everything works like a charm with perfect scaling. It is an easy way to use a secondary card for games like this.
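
In Direct3D 9 terms (again an assumption on my part; the names here are mine, not the game's), the proposed "-adapter N" switch boils down to something like this: list the adapter ordinals that Windows exposes (one per connected display) and create the device on whichever one the player picked.

```cpp
#include <windows.h>
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

// Sketch of how a "-adapter N" switch could pin rendering to one card/display.
IDirect3DDevice9* createDeviceOnAdapter(IDirect3D9* d3d, UINT adapter, HWND window)
{
    UINT count = d3d->GetAdapterCount();  // one ordinal per connected display
    for (UINT i = 0; i < count; ++i) {
        D3DADAPTER_IDENTIFIER9 id = {};
        d3d->GetAdapterIdentifier(i, 0, &id);
        std::printf("adapter %u: %s\n", i, id.Description);
    }
    if (adapter >= count)
        return nullptr;  // requested adapter does not exist

    D3DPRESENT_PARAMETERS pp = {};
    pp.Windowed      = TRUE;                    // windowed mode, as discussed above
    pp.SwapEffect    = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow = window;

    IDirect3DDevice9* device = nullptr;
    // The first parameter is what ties all rendering to the chosen card.
    d3d->CreateDevice(adapter, D3DDEVTYPE_HAL, window,
                      D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
    return device;
}
```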

Please....pretty please, can we have this soonish(tm)?