Submission Details

_mat_ - Extreme League

1566 marks with AMD Ryzen 7 2700X at 3,530 MHz

Ranking position: n/a

Global 8x CPU rank: 1st

Ryzen 7 2700X rank: 1st out of 1 (Gold Cup)

Points earned for overclocker league

  • _mat_ has chosen to disable points for this submission.

Media gallery

  • BIOS/UEFI Settings
  • XTU settings
  • XTU hardware
  • ROG Rampage VI Apex
  • Trident Z RGB

Verification URL, image, checksum: valid

Hardware details

CPU details

  • Model: AMD Ryzen 7 2700X 'Pinnacle Ridge'
  • Cooling: Air (Custom)
  • Temperature (°C): 59 (load)
  • Core frequency: 3,530 MHz (-4.59% vs. stock)

Power details

  • Manufacturer: Seasonic
  • Series: PRIME
  • Power: 1,000 Watt

Recent Comments

_mat_ (Austria) commented on his own score:

Love my new Ryzen 2700X with 18 Cores. :D

zeropluszero (Australia) says:

are you going to give us the download link for FXTU or what?

_mat_ (Austria) says:

I will eventually. But first I want you to know the full implications of this tool. CPU-Z will take a hit as well, although not on every platform. Skylake-X can't be helped though: https://valid.x86.fr/5qy93f

newlife (Australia) says:

Well you just broke overclocking

mickulty says:

why

zeropluszero (Australia) says:

imagine being poor leeghoofd and having to wake up to this bullshit every day.

GeorgeStorm (United Kingdom) says:

As 'funny' as this is, does it actually help anything or just make hwbot even less viable?

_mat_ (Austria) says:

The reasons why I am publishing this right now:
1) I have first talked to benchmark vendors (including Intel) and they know about these problems, yet they do nothing.
2) HWBOT knows about this, yet there isn't even an official statement on the current state of benchmark security.
3) I've shown this on the HWBOT forums and nobody really cared.

To sum it up: Nobody cares when actually everybody involved should. Benchmarks can't be taken seriously in their current state. There are not enough security measures implemented, the way timers are used is unreliable and the concept of handling results and hardware information is just plain wrong. I can go into details if anybody is interested.
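To make the timer point concrete, the weak Win32 pattern being criticized usually boils down to something like this (a minimal sketch, not taken from any actual benchmark): both the counter and its frequency are plain user-mode API calls that the benchmark has to trust blindly.

```c
/* Minimal sketch of the weak Win32 timing pattern: the benchmark trusts
 * whatever counter and frequency the OS reports, so skewing the underlying
 * clock source (or hooking these calls) scales the measured time. */
#include <windows.h>
#include <stdio.h>

static void run_workload(void)
{
    volatile double x = 0.0;
    for (long i = 1; i < 50000000; ++i)
        x += 1.0 / i;
}

int main(void)
{
    LARGE_INTEGER freq, start, stop;

    QueryPerformanceFrequency(&freq);   /* the reported frequency is taken on faith */
    QueryPerformanceCounter(&start);
    run_workload();
    QueryPerformanceCounter(&stop);

    /* The resulting "score" is only as trustworthy as the counter behind it. */
    double seconds = (double)(stop.QuadPart - start.QuadPart) / (double)freq.QuadPart;
    printf("workload took %.3f s\n", seconds);
    return 0;
}
```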

Additionally, every benchmark developer is on their own to implement the necessary "features" above. That leads to inconsistent quality/reliability of the benchmarks and their results. This was made pretty clear after the time-skewing debacle and the mess it has left. But smaller "attacks" have followed as well, like the Nehalem dual-socket phenomenon where people really believed that those old Xeons were world-record material. These problems hit hard because old benchmark versions have to be excluded from rankings or legacy benchmarks get more difficult to moderate.

All that leads to trust issues among overclockers, especially new ones. And that's totally understandable, because even I am coming across GPUPI results that I can't comprehend. It's not only about cheating but about trusting the mechanisms of the benchmarks, the hardware information gathering and the timers. So we compare the performance of hardware with benchmarks that are not built with reliability in mind. Even with GPUPI 3 I am right now only chasing problems, not preventing them in the first place.

So what needs to be done is a uniform mechanism for timing, result handling and hardware detection that all benchmarks can use. The validation logic needs to be transferred to the submission server as well, so decisions about result exclusion can be made at any point, even retroactively.
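Purely as an illustration (none of these field names come from _mat_ or HWBOT), a uniform, server-validatable submission record could look roughly like this: the client ships raw measurements and a hardware fingerprint, and the server keeps the authority to re-run or tighten the validation rules later, even for old results.

```c
#include <stdint.h>

/* Hypothetical record layout, for illustration only. The point is that the
 * raw measurements travel to the server, which can re-validate (or
 * retroactively exclude) results at any time. */
typedef struct {
    uint32_t payload_version;     /* lets the server re-check old records */
    char     benchmark_id[32];    /* e.g. "gpupi", "xtu" */
    char     timer_source[32];    /* which clock produced the raw counts */
    uint64_t counter_start;       /* raw counter values instead of a precomputed score */
    uint64_t counter_stop;
    uint64_t counter_frequency;
    uint8_t  hw_fingerprint[32];  /* hash over CPUID/SMBIOS-derived hardware info */
    uint8_t  signature[64];       /* produced by the client component, verified server-side */
} submission_record_t;
```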

So why am I showing this? Because things need to be taken seriously from all sides and have to be changed for the better.

yosarianilives (United States) says:

2 hours ago, _mat_ said:

So what needs to be done is a uniform mechanism for timing, result handling and hardware detection that all benchmarks can use. The validation logic needs to be transferred to the submission server as well, so decisions about result exclusion can be made at any point, even retroactively.

What would it take to implement a wrapper for every benchmark that will use your timers?

_mat_ (Austria) says:

If I were to do something like that, it would be one additional application that runs next to the legacy benchmarks. It injects proper time functions into the executable so it will be able to use my Reliable Benchmark Timer driver. It's kind of an emulation layer for the weak Win32 API timer functions. The wrapper could also handle screenshot taking and uploading, some security checks and hardware detection.
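For illustration, one way such an injection could work is classic IAT patching; the sketch below assumes that mechanism and uses a purely hypothetical replacement timer (the actual injection method and driver interface are not specified here).

```c
#include <windows.h>
#include <intrin.h>

/* Replacement timer; a real wrapper would fetch a tamper-resistant count
 * from the (hypothetical) Reliable Benchmark Timer driver, e.g. via
 * DeviceIoControl. The raw TSC read here is only a placeholder. */
static BOOL WINAPI ReliableQueryPerformanceCounter(LARGE_INTEGER *lpCount)
{
    lpCount->QuadPart = (LONGLONG)__rdtsc();
    return TRUE;
}

/* Walk the module's import address table and swap the
 * kernel32!QueryPerformanceCounter entry for the replacement above. */
static void PatchImportedQpc(HMODULE module)
{
    BYTE *base = (BYTE *)module;
    IMAGE_DOS_HEADER *dos = (IMAGE_DOS_HEADER *)base;
    IMAGE_NT_HEADERS *nt  = (IMAGE_NT_HEADERS *)(base + dos->e_lfanew);
    IMAGE_DATA_DIRECTORY dir =
        nt->OptionalHeader.DataDirectory[IMAGE_DIRECTORY_ENTRY_IMPORT];
    if (dir.VirtualAddress == 0)
        return;

    IMAGE_IMPORT_DESCRIPTOR *imp =
        (IMAGE_IMPORT_DESCRIPTOR *)(base + dir.VirtualAddress);
    FARPROC target = GetProcAddress(GetModuleHandleA("kernel32.dll"),
                                    "QueryPerformanceCounter");

    for (; imp->Name != 0; ++imp) {
        IMAGE_THUNK_DATA *thunk = (IMAGE_THUNK_DATA *)(base + imp->FirstThunk);
        for (; thunk->u1.Function != 0; ++thunk) {
            if ((FARPROC)thunk->u1.Function != target)
                continue;
            DWORD old;
            VirtualProtect(&thunk->u1.Function, sizeof(thunk->u1.Function),
                           PAGE_READWRITE, &old);
            thunk->u1.Function = (ULONG_PTR)ReliableQueryPerformanceCounter;
            VirtualProtect(&thunk->u1.Function, sizeof(thunk->u1.Function),
                           old, &old);
        }
    }
}
```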

All that would need some changes in the HWBOT submission or - and that's something I would prefer - its own validation site. It will show a screenshot with a watermark that adds a timestamp and some hardware information that has to match the uploaded values on the site.
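A sketch of what such a watermark payload might contain (the layout and the FNV-1a digest are illustrative choices, not anything described above):

```c
#include <stdio.h>
#include <stdint.h>
#include <time.h>

/* Tiny FNV-1a digest, purely as an illustrative stand-in for whatever
 * hash/signature a real wrapper would use. */
static uint64_t fnv1a64(const char *s)
{
    uint64_t h = 14695981039346656037ULL;
    while (*s) {
        h ^= (uint8_t)*s++;
        h *= 1099511628211ULL;
    }
    return h;
}

int main(void)
{
    /* Hypothetical detected hardware string; a real wrapper would build
     * this from CPUID/SMBIOS data and the benchmark result itself. */
    const char *hw_info = "AMD Ryzen 7 2700X|8C/16T|3530 MHz";

    char watermark[128];
    time_t now = time(NULL);
    snprintf(watermark, sizeof(watermark), "HWBOT %lld %016llx",
             (long long)now, (unsigned long long)fnv1a64(hw_info));

    /* This string would be rendered onto the screenshot before upload and
     * cross-checked against the values submitted to the validation site. */
    puts(watermark);
    return 0;
}
```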

Will need something similar for the new GPUPI 4 as well, so basically it's something I am going to do. Without the screenshot watermark of course, that's a workaround for legacy benchmarks.

The big problem with this is that some benchmarks need Windows XP to perform at their best. On XP there is no such thing as a reliable timer, and I won't write an alternate driver for some minor security checks. The application could still work for screenshot timestamping to provide at least some validation. But I need to think that over.

Another question: Would anyone here donate for a project like this?

mickulty says:

4 hours ago, _mat_ said:

The reasons why I am publishing this right now:
1) I have first talked to benchmark vendors (including Intel) and they know about these problems, yet they do nothing.
2) HWBOT knows about this, yet there isn't even an official statement on the current state of benchmark security.

What responsible disclosure deadline did you give them?  Did you share steps to reproduce with Intel, and any other active benchmark developers?

leeghoofd (Belgium) says:

Deadline? I think most are aware that a good programmer can break any benchmark out there, even the wrapper-equipped ones. However, does that lead to any responsibility affiliated to HWBOT? HWBOT needs to do a post on benchmark security or integrity? Really? Honestly, no idea what some people have been drinking, but I want a jerrycan of that moonshine to keep me trying to run this place...

IF everybody ran the benchmarks as they are designed to be run, without e.g. any downclocking, pulling cables or changing the benchmark settings, we would not have all these rules and bugged scores. I would even dare to claim that the users are more to blame than the site that hosts the scores in its "special way".

If Matt can design a wrapper for e.g. the Cinebenches/wPrimes, that would be cool, and I would be happy to donate to make these 2D benchmarks more bulletproof. XP is going down anyway, so if you can make it compatible from Win7 and up, that would be perfectly fine for futureproofing these legacy benchmarks.

_mat_ (Austria) says:

3 hours ago, mickulty said:

What responsible disclosure deadline did you give them?  Did you share steps to reproduce with Intel, and any other active benchmark developers?

I reported it in autumn 2017 and detailed the vulnerabilities to Intel and Futuremark. It was full disclosure; I talked to the XTU devs and some 3DMark devs. Massman accompanied the process and we were searching for a solution back then. He knew that this was (and is) an important step for HWBOT to validate results, even tried to finance it and helped wherever he could. Sadly it was too close to his departure, so I hadn't come up with a viable solution at that point.

3 hours ago, Leeghoofd said:

However, does that lead to any responsibility affiliated to HWBOT? HWBOT needs to do a post on benchmark security or integrity? Really?

Yes, I think that HWBOT is responsible for the quality of benchmark submissions. I don't criticize the manual moderation process or your work. You are dedicating a lot of effort to this and it's very much appreciated. But if you want HWBOT to still be taken seriously as a platform to host and validate world records, rankings and comparable results, you should at least not ignore the fact that there is a problem. You don't even need to come up with a solution, but accept it, include yourself in the discussion and contribute if you can. Blaming it on overclockers in general is NOT the solution. Most play by the rules, buy expensive hardware and bench for hours to get credit for their achievements. But a single "OnePageBook" devalues these efforts just by editing a screenshot in what ... 3 minutes tops? That's a problem that overclocking has suffered from for as long as I can remember. Let's try to solve it together.

mickulty says:

5 hours ago, _mat_ said:

It was full disclosure

I don't think that means what you think it means. Full disclosure would be immediately making everything public, including a guide to using the exploit.

Responsible disclosure is the practice of reporting vulnerabilities to those who would fix them first, giving them everything they would need to fix it, and then setting a clear deadline for when you go public. By setting a clear deadline you encourage action, rather than people just hoping the vulnerability isn't found by bad guys; this is the way that, for example, Google Project Zero does it, and it is standard practice in infosec. Obviously it's not suitable where the benchmark isn't actively maintained, but for XTU it seems appropriate.

_mat_ (Austria) says:

I didn't know that was an infosec term, thanks for clarifying. It's my first report, never done that before.

I didn't set a date; I somehow believed back then that this would be fixed, of course.
