How is your benchmark and Character Creation?

I got this much right now.

pso20210525_221143_000.jpg

And my OC looking so good...

pso20210526_003348_008.png

Benchmark score.png

Character.png

This is how my computer fared on the benchmark, plus a slightly changed version of my JP character.

@sheizansc

363 score

Did you try turning your graphics settings down? 😓

I ran two tests: one with the default settings and one with a few small changes:

First test (default settings): pso20210525_152523_000.png

Second test (slightly lower graphics settings): pso20210525_152921_001.png

I'm going to work on making a male and a female character with what's available. After lowering a few settings, the framerate stayed above 30 FPS most of the time; with the default settings it dipped to around 15 FPS once, and that was the only time it dropped below 30 FPS.

My score doesn't mean I can't run the game. It runs fine at medium settings, even though the benchmark recommends the lowest preset:


A GTX 1070 gets more than double this score. (I have a GTX 1060.) People might complain that it isn't using my dedicated GPU, but it is; because of how NVIDIA + Intel hybrid GPU systems work, the NVIDIA usage only shows up in Task Manager.

My average FPS was about 45. You can see my character creations here: https://forum.pso2.com/topic/11661/character-creation-benchmark-releases-today-post-your-characters-here/11 (below FerrousAradicen's Male cast characters.)
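If anyone wants to double-check which GPU is actually doing the work while the benchmark runs, a rough sketch like this just polls the NVIDIA driver's nvidia-smi tool once a second (assuming nvidia-smi.exe is on your PATH; it ships with the GeForce driver, so the exact install location can vary):

```python
import subprocess
import time

# Poll nvidia-smi while the benchmark is running.
# --query-gpu / --format are standard nvidia-smi options; the example output
# in the comment below is illustrative, not a captured result.
QUERY = [
    "nvidia-smi",
    "--query-gpu=name,utilization.gpu,memory.used",
    "--format=csv,noheader",
]

for _ in range(30):  # sample for roughly 30 seconds
    result = subprocess.run(QUERY, capture_output=True, text=True)
    print(result.stdout.strip())  # e.g. "GeForce GTX 1060, 97 %, 2311 MiB"
    time.sleep(1)
```

If the utilization column stays near zero for the whole run, the game really is landing on the integrated GPU rather than the dedicated one.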

@coldreactive said in How is your benchmark and Character Creation?:

My score doesn't mean I can't run the game. It runs fine at medium settings, even though the benchmark recommends the lowest preset:

A GTX 1070 gets more than double this score. (I have a GTX 1060.) People might complain that it isn't using my dedicated GPU, but it is; because of how NVIDIA + Intel hybrid GPU systems work, the NVIDIA usage only shows up in Task Manager.

It's actually possible to get your NVIDIA GPU to show up there even with an Intel CPU that comes with a GPU. I don't remember how I did it, around six years ago now, but my computer doesn't even show the Intel GPU in Device Manager as disabled or anything. I think it was a setting in my BIOS to make the NVIDIA the default Graphics Chip, but I don't remember. With that, or with the Intel chip disabled in Device Manager, it should always be able to detect your NVIDIA like normal.
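For anyone wondering whether Windows still exposes the Intel iGPU after a change like that, one way to check is to list the video controllers WMI reports. This is only a sketch; the property names come from the standard Win32_VideoController class, and in my experience a BIOS-disabled iGPU usually won't be listed at all, while one disabled in Device Manager typically still shows up with a non-OK status:

```python
import subprocess

# List every display adapter Windows currently exposes via WMI.
cmd = [
    "powershell", "-NoProfile", "-Command",
    "Get-CimInstance Win32_VideoController | "
    "Select-Object Name, Status, VideoProcessor | Format-Table -AutoSize",
]
print(subprocess.run(cmd, capture_output=True, text=True).stdout)
```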

@Ragnawind said in How is your benchmark and Character Creation?:

I think it was a setting in my BIOS to make the NVIDIA the default Graphics Chip

I've done this before. If I disable the Intel GPU or switch the default away from it... believe it or not, the Intel GPU is the one that gets used, and the NVIDIA one becomes disabled. When I restart after doing this, my resolution gets reset to 1280 x 720 as a result.

@coldreactive said in How is your benchmark and Character Creation?:

I've done this before. If I disable the Intel GPU or switch the default away from it... believe it or not, the Intel GPU is the one that gets used, and the NVIDIA one becomes disabled. When I restart after doing this, my resolution gets reset to 1280 x 720 as a result.

There must be a bug in your BIOS then, since that should never happen. I just checked mine, and it had been changed from Auto to the PCI option, which seems to have disabled the integrated GPU altogether.

Looks better from the back. Only close-ups make for halfway decent screenshots.

Can't tell if the main issue is the default face mesh, the height censorship elongating the legs, or a combination of both.

pso20210525_150621_018.jpg pso20210525_150546_012.jpg pso20210525_150505_006.jpg

I'll play along with what I have.

Thing to note: the resolution you run at will be a major influence on the score, since it's rated purely by the FPS you maintain. I turned off things like V-Sync/G-Sync and the FPS limiter, and the benchmark seems to cap out at 179 FPS no matter how powerful your system is. Most players are still at either 1080p or 720p; 1440p (2K) is a bit more upscale and growing, while 4K is a luxury. Setting an FPS limit (useful for G-Sync) will of course limit your score potential.
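Just to illustrate the "rated by FPS maintained" part, here's a toy sketch. It is not SEGA's actual scoring formula (that isn't public); the 179 FPS cap is the only number taken from what I saw, and the points-per-FPS weighting is completely made up:

```python
# Toy illustration of a score driven purely by maintained FPS.
# NOT the real benchmark formula; FPS_CAP matches the observed 179 FPS ceiling,
# POINTS_PER_FPS is an arbitrary made-up weight.
FPS_CAP = 179
POINTS_PER_FPS = 10

def toy_score(frame_times_ms):
    """frame_times_ms: one entry per rendered frame, in milliseconds."""
    fps_samples = [min(1000.0 / t, FPS_CAP) for t in frame_times_ms]
    avg_fps = sum(fps_samples) / len(fps_samples)
    dips = sum(1 for f in fps_samples if f < 30)  # frames below 30 FPS
    return round(avg_fps * POINTS_PER_FPS), round(avg_fps, 1), dips

# Mostly ~179 FPS with a short stretch of ~25 FPS frames:
print(toy_score([5.6] * 5000 + [40.0] * 40))
```

The takeaway is just that a lower resolution lets the average sit at the cap for longer, which is presumably why a 720p run scores so much higher than a 4K run on the same hardware.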

This is my daily-driver "4K" resolution score. No overclocks, all in-game settings maxed out. I'll be playing at this; the game is too gorgeous not to go all out. pso20210525_163520_000.jpg

My initial "tryhard like it's 3DMark" 720p score; 179 FPS seems to be the hardcoded limit. So scoring higher is about holding that maximum FPS with as few dips as possible. pso20210525_163149_000.jpg

I'm still reworking my character. Uploading the saved PSO2 character file didn't come out nearly the way I wanted.

Update: perfect score achieved after going with bare-bones settings, an uninterrupted 179 FPS in this case.

pso20210526_021131_000.jpg

Finally, my character, given what I'm limited to working with:

pso20210526_104648_003.jpg

pso20210526_104741_006.jpg

i5-6600K, 64 GB RAM, NVMe and SSD drives, driving an EVGA GeForce GTX 970 on current drivers set to "application decides":

Lowest settings get a score of 29122, with FPS averaging over 100 and a minimum of 47.

Ultra settings get a score of 2196, with FPS averaging around 50 and a minimum of 27.

It seems the most graphics-intensive section is when the first doll hits the ground at the beginning of the benchmark.

I played nearly the entire beta at ultra, manually tweaking everything to max, and it was still playable...

EDIT: Forgot to mention the pair of old Dell 16:10 displays at 1920x1200

@Ragnawind said in How is your benchmark and Character Creation?:

There must be a bug in your BIOS then, since that should never happen. I just checked mine, and it had been changed from Auto to the PCI option, which seems to have disabled the integrated GPU altogether.

Are you referring to the old Lucid Virtu MVP stuff? I don't think that was released for any other platforms outside Intel Ivy Bridge, and the Lucidlogix website is pretty much gone...

@PrsnOfDsntrst said in How is your benchmark and Character Creation?:

Are you referring to the old Lucid Virtu MVP stuff? I don't think that was released for any other platforms outside Intel Ivy Bridge, and the Lucidlogix website is pretty much gone...

No clue. All I know is that I have a Haswell processor, a 4th-gen i5-4460 with integrated HD 4600 graphics. I use an NVIDIA GTX 750 Ti for graphics right now, though, and Windows doesn't even detect the integrated GPU. The computer is an old Acer Aspire TC-705. I don't use the Intel GPU for anything anyway, since it's much weaker than my NVIDIA.

@Ragnawind said in How is your benchmark and Character Creation?:

No clue. All I know is that I have a Haswell processor, a 4th-gen i5-4460 with integrated HD 4600 graphics. I use an NVIDIA GTX 750 Ti for graphics right now, though, and Windows doesn't even detect the integrated GPU. The computer is an old Acer Aspire TC-705. I don't use the Intel GPU for anything anyway, since it's much weaker than my NVIDIA.

The Lucid thing was an attempt to do something similar to SLI with the Intel GPU and a discrete GPU. While a nice idea, it didn't work very well for most people; my own results were rather inconclusive, which is why that system ended up with a pair of overclocked EVGA GTX 570s.

As for the issue at hand, turning on the iGPU in the BIOS enables it as an additional, independent graphics device: there won't be any load-sharing with a discrete GPU, and you'd still need to load the Intel drivers for it. The default display setting in the BIOS is exactly what it says: it picks which device is the boot display, the one Windows designates "Display 0". If you ran one display off the Intel and another off the discrete GPU, then whichever one the game's display was attached to would be the one doing the graphics processing.
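If you want to see exactly which adapter Windows is treating as the boot/"Display 0" device, a small ctypes sketch like this lists every adapter and flags the primary one. Windows only; it just calls the standard EnumDisplayDevices API, nothing game-specific:

```python
import ctypes
from ctypes import wintypes

# Matches the Win32 DISPLAY_DEVICEW structure.
class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [
        ("cb", wintypes.DWORD),
        ("DeviceName", wintypes.WCHAR * 32),
        ("DeviceString", wintypes.WCHAR * 128),
        ("StateFlags", wintypes.DWORD),
        ("DeviceID", wintypes.WCHAR * 128),
        ("DeviceKey", wintypes.WCHAR * 128),
    ]

ATTACHED_TO_DESKTOP = 0x1  # DISPLAY_DEVICE_ATTACHED_TO_DESKTOP
PRIMARY_DEVICE = 0x4       # DISPLAY_DEVICE_PRIMARY_DEVICE

user32 = ctypes.windll.user32
i = 0
while True:
    dev = DISPLAY_DEVICE()
    dev.cb = ctypes.sizeof(dev)
    if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dev), 0):
        break
    flags = []
    if dev.StateFlags & ATTACHED_TO_DESKTOP:
        flags.append("attached")
    if dev.StateFlags & PRIMARY_DEVICE:
        flags.append("primary / Display 0")
    print(f"{dev.DeviceName}: {dev.DeviceString} [{', '.join(flags) or 'inactive'}]")
    i += 1
```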

beda1676-5283-4802-bc14-b01f2ca071d4-image.png

I think I did pretty well. I have my GPU and CPU overclocked, too.

@LusterMain this was with the unchanged settings hahah

I got around 3.8k with everything turned off and light quality set to 20, so I'm going to be doing that for now.