Myriad Security Diagnostic Testing (Important!)

PERIOD PAINS

About those 6 dots......

I’ve still got that NAGGING feeling we need to be paying more attention to the Placement / Location of the STRICTLY-SIX periods in EVERY dataset . . . . . .
… and perhaps “in relation to” the letters around them…
… and maybe encoding their positions as Integers . . .
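If anyone wants to play with that, here's a minimal sketch of encoding the period positions as Integers, using a made-up placeholder string (NOT a real dataset):

```python
# Sketch: encode the positions of the periods in a dataset string as integers.
# The sample string is a placeholder -- substitute a real fetched dataset.
sample = "AB.CD.EFG.HI.JKLM.NO.PQ"

# Absolute 0-based positions of every '.'
positions = [i for i, ch in enumerate(sample) if ch == "."]
print(positions)   # [2, 5, 9, 12, 17, 20] -- strictly six, as expected

# Gaps between consecutive periods -- "in relation to the letters around them"
gaps = [b - a for a, b in zip(positions, positions[1:])]
print(gaps)        # [3, 4, 3, 5, 3]
```

The gap list might matter as much as the absolute positions, since an offset at the start of the string would shift every absolute index but leave the gaps intact.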

COLON-OSCOPY…?

2 down, 4 to go?

But how in Blazes is the : supposed to fit into all this?
We’ve “lost 2” as a result of 31422 and 85671 being shut down…
So, perhaps we can at least conclude that the datasets are “ONLY” dependent on their own, and maybe only their paired-server’s, dataset…
((That is to say, none of the data hinges on anything outside its associated country zone / paired server info. This “makes sense” if different people around the World are “meant to focus” on ONE set of data primarily, being “their” servers…))

2 B or not 2 B…?

EMEA servers -buggered-

See, if it was INTENTIONAL that the TestConn data for 31422 and 85671 resulted in DIFFERING locations (Oz/NZ/APAC vs UK/EU/EMEA) having the fastest TestConns, then they’d “already planned” to remove those 2 from the equation.

But nothing else of significance was “found” with them (yet), and the data is probably just as complex to decipher…

So I’d err on the side of believing that there was “a right bit of a c0ck-up” with the TestConn figures for 31422, since it was “rightly” paired to 85671, whose TestConns were CONCLUSIVELY UK->EU->EMEA … so Theoretically, so too should 31422 have been.

OK, that aside… if “losing” EMEA was “no big deal” (to them), then perhaps NONE of the datasets with COLONS in them are that important, either.

GOING FORWARD / DECIPHER STAGE

I’d recommend a BIGGER FOCUS on the S.Am. and US WEST datasets, now.

(3) 11986 + (10) 60206 = S.Am. … and … (4) 67240 + (8) 40144 = US West
((PS: @Desdanova - yes, you’re right; they’re all Paired: Primary + Backup!))

No Colons.
AND, it seems, it was IMPORTANT enough for them to AMEND the Instructions to that 25+ (not 50+) Mbps thing I pointed out earlier, which someone else (on Reddit) seemed convinced was related to (or important for) one of the US West servers… :smiley:

… and here, I take a break . . . . . .

1 Like

Good find, my friend; however, we already have the information. Scroll up. Keep posting your finds though, please. Too much info is fine by me.

I will post it for you

Edit: Done! @Desdanova

2 Likes

Lol, that looks familiar :wink:
Here is the link by the way. Might be useful and easier to copy from if anyone wishes:

Hello again, @DevilinPixy. :slight_smile:
I did a thing with that data, and more specifically with the image @Espilonarge made…
Back up here…

Note: that was BEFORE it was revealed / confirmed that N.America was actually SPLIT into West and East… :blush:

1 Like

I am just now catching up. I posted that data last night on Pastebin for others to use. There is some interesting stuff going on with the paired data, which becomes quite apparent when you line up the full strings. I just figured to include the link to make it easier, as you can’t really copy the data from an image if you wish to mess with it yourself.

2 Likes

Curious where we are at now. Did we get any new clue? I did see the STATUS change, but wonder if any other additional info has been released. What is expected from us to do next?

2 Likes

Right now, based on the recent status update, we are trying to decrypt the servers’ fetched data. The implication in the statement is that it is possible with our current information.
I don’t know if we are reading too much into it though, as we didn’t really decrypt anything besides the server pairs and they said keep it up.

1 Like

Yeah, as to my knowledge, this is the latest communication posted to Reddit via satcom-70:

1 Like

Direct response to @Argent-Star from satcom-70 on reddit

satcom-70• 12m

When attempting to decrypt the fetchdata output, keep in mind important information such as connection quality and associated backup server.

1 Like

CREDIT WHERE DUE

Non nobis...

Oh no, dear @SingularGleam – I cannot take credit!
It wasn’t “direct” to moi; but I found that reply to someone else.
Only tried to make it “tidy” to share, so I had Reddit give me a “direct link”…

It is, originally, as @LilLadyD76 shows in her phone-app screen-cap above - a response to gosh-what’s-his-name suggesting that 31422 was the “problem server” just cuz its TestConn data was returning Oz (APac) as Fastest, when it should’ve been UK (EMEA).

SPECULATIONS

IF ... THEN

Far as I’m concerned, that was a Satcom muck-up… oops.

… and as I say elsewhere (recent-ish…),
if they can do these two things:
(1) discard Servers #1 (31422 ) & #5 (85671) so “easily”; whilst
(2) ensuring a v public correction of the Instructions (Upload Best from 50+ to 25+);
… then they:
(1) don’t “need” (for us to have) Servers #1 and #5; but
(2) DO “need” us to have Servers #4 (67240) and #8 (40144) – as I’d noted someone else comment that the correction “affected” how we’d look at these 2 (somehow).

Extrapolating from that, I’m sticking my neck out to Suggest that we:
(1) pay less attention to ALL data containing a “:” (colon); and
(2) pay MORE attention to the other 2 pairs – S.Am., and Am.West.

So that’s #3 (11986) + #10 (60206) … and … #4 (67240) + #8 (40144).

QUOTING OFFICIAL INSTRUCTIONS

Sources: PDF + Reddit u/satcom-70

As discussed earlier w/ @SingularGleam, @DevilinPixy, if the INSTRUCTIONS read…
“Make sure to always use the best available connection when analyzing the contents of these servers.”

AND paying attention to u/satcom-70’s direct response (to THIS Redditor here):
“… Best connection values for each server required for decryption. …”

… then finding a way to use the TestConnection data, in some way, as a DECRYPT KEY, is what we’re now looking for…

CURRENT THINKING

Next Steps...

I suggested to Singular that we might try using one Server’s TestConn values as a D-Key for the other Server (in the Pair)'s datastring, and vice-versa for each Pair.
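If anyone wants to prototype that, here's a minimal sketch, assuming the TestConn values can be treated as a repeating XOR keystream over the partner's datastring. Every value below is an invented placeholder, not real server data:

```python
from itertools import cycle

# Placeholder values -- NOT the real TestConn figures or datasets.
testconn_primary = [48, 12, 7]     # pretend readings for a Primary server
backup_datastring = "XK9.QR2T"     # pretend fetched data from its Backup

def xor_with_key(text, key):
    """XOR each character's code against the key values, repeating the key."""
    return "".join(chr(ord(ch) ^ k) for ch, k in zip(text, cycle(key)))

decoded = xor_with_key(backup_datastring, testconn_primary)

# XOR is its own inverse, so applying the key twice recovers the input --
# a sanity check that the keystream is being applied consistently.
roundtrip = xor_with_key(decoded, testconn_primary)
print(repr(decoded), roundtrip == backup_datastring)
```

Worth noting XOR is only one candidate; modular addition or a Vigenère-style shift would be tried the same way, just with a different `xor_with_key` body.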

I’m presently having a little play-around with casting a “visual eye” over paired datasets, and trying to figure out the role of the PERIODS within each string…

[Edit]
AS IT TURNS OUT

I HAVE received a “direct response” from u/satcom-70:
“… , keep in mind important information such as connection quality and associated backup server.”

Luckily, this goes hand-in-hand with what I suggested to Singular
“connection quality” = the TestConn figures
“assoc backup server” = using the BACKUP’s dataset.

CRITICALLY, I think this is also a SLIP-UP –

“connection quality and associated backup” – I happen to read this as “connection quality of the Primary Server and associated data of the backup server”.
A-HA!! :smiley:
[/edit]

2 Likes

I’m relaxing and slowly trying things. I have taken the frequency analysis and used the extra characters from each pair to see if anything jumps out.
I have done the 2nd and 3rd pair (counting from the eliminated pair) and no joy. I tried the basic stuff (hex, dec, base64, etc.) but couldn’t make a link or words from them.
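For anyone wanting to repeat those checks, here's a minimal sketch of the hex/base64 attempts, with a placeholder string standing in for the extra characters ("no joy" would just mean the decode fails or produces gibberish):

```python
import base64, binascii

# Placeholder for the "extra characters" pulled from a pair -- not real data.
extra = "4f4b"

# Try hex first
try:
    as_hex = bytes.fromhex(extra)
    print("hex ->", as_hex)   # "4f4b" happens to decode to b'OK'
except ValueError:
    as_hex = None
    print("hex -> not valid hex")

# Then base64 (validate=True makes stray characters fail loudly
# instead of being silently ignored)
try:
    as_b64 = base64.b64decode(extra, validate=True)
    print("b64 ->", as_b64)
except binascii.Error:
    as_b64 = None
    print("b64 -> not valid base64")
```

A string can be "valid" in several encodings at once (as the placeholder is here), so the test is really whether any decode yields readable words, not just whether it succeeds.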

Here - this may not be anything, but there was effort involved so I figured I would share it.
User X| Vanquish |X on Discord made this, the explanation is inside.

If the servers are paired, then why is the data different between them? Shouldn’t a backup have a copy of the primary? In the IT world, that is the reason for a backup.

Now I hope that the data isn’t stripped between servers. :unamused: If that is the case, then we need to be combining the data strings in order to decrypt them.

I don’t quite get it either, because if, for example, server 2 gets its data from backup server 7, then why do I not get that data directly when fetching from 7, but instead get server 2’s data returned again? Each server is basically a main server as well as a backup server, depending on which one you fetch data from.

I don’t quite understand that sheet, as it appears to be selectively removing the following parameters: myriad.exe, atlas.exe, csd.exe, and satcom.exe. Why not remove them from each string part, as was done with the rest?

1 Like

I honestly couldn’t say - I wasn’t able to make sense of it either - I thought he HAD just removed them from each string.

Interesting.
Here’s sthg to consider…

If the DIFFERENCE in LENGTH of the (paired) datasets HAPPENS TO CORRESPOND with the CHARACTER LENGTH of those filenames (ie incl. “.”)…
myriad.exe = 10
atlas.exe = 9
csd.exe = 7
satcom.exe = 10
… then removing these filenames would tend to “equalise” the datasets…
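That length-match hypothesis is easy to check mechanically. A sketch, with placeholder strings standing in for the real paired datasets:

```python
# Filename lengths from the list above (including the ".")
filenames = ["myriad.exe", "atlas.exe", "csd.exe", "satcom.exe"]
name_lengths = {name: len(name) for name in filenames}
print(name_lengths)   # {'myriad.exe': 10, 'atlas.exe': 9, 'csd.exe': 7, 'satcom.exe': 10}

# Placeholder paired datasets -- NOT the real strings.
primary = "A" * 97
backup  = "B" * 87

diff = abs(len(primary) - len(backup))
# Does the length difference match any filename's length?
matches = [name for name, length in name_lengths.items() if length == diff]
print(diff, matches)  # 10 ['myriad.exe', 'satcom.exe']
```

Note the ambiguity baked in: myriad.exe and satcom.exe are both 10 characters, so a difference of 10 alone can't say which filename was removed.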

I’m interested in seeing HOW SIGNIFICANT it is that the data is presented to us in 8 SEPARATE LINES (of up to about 20 chars each?), which are NOT equal in length.
Are we then “supposed to” take certain bits (sorry bytes!) of data OUT of each row, so that all 8 become “equal length”, before processing further…???

Is each of the 8 rows of data PER SERVER supposed to be a PHRASE, perhaps?
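A quick sketch of sizing up those 8 rows, with placeholder rows (NOT the real data), to see how much each would have to shed to equalise at the shortest:

```python
# Placeholder rows -- substitute the 8 actual lines from one server's dataset.
rows = ["ABCDEFGHIJKLMNOPQRST", "ABCDEFGHIJKLMNOP", "ABCDEFGHIJ",
        "ABCDEFGHIJKLMNOPQR", "ABCDEFGHIJKLM", "ABCDEFGHIJKLMNOPQRS",
        "ABCDEFGHIJKL", "ABCDEFGHIJKLMNO"]

shortest = min(len(r) for r in rows)
# Characters each row would need to lose to equalise at the shortest row
surplus = [len(r) - shortest for r in rows]
print(shortest, surplus)
```

If those surplus counts lined up with anything meaningful (filename lengths, period counts, TestConn digits), that would be a strong hint the trimming idea is intended.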

2 Likes

Maybe I am misinterpreting, but if you remove, for example, exe from all strings, then there would not be a single e or x in any of the resulting strings. However, I do see them still present.

1 Like

According to someone’s previous per-server-dataset character count, there were more than 2 Es and more than 1 X per dataset, so you would still see other Es and Xs “after removing” one set of “exe” from the string…

Curiously and CRITICALLY, though, the question is WHERE the 2 Es and 1 X “should” be taken from, as that would naturally affect the ORDERING of all the other characters in the set.
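To illustrate WHY the removal point matters, here's a tiny sketch with a placeholder string containing two “exe” occurrences; removing the first vs the second leaves the same characters, but in a different order:

```python
s = "AexeBexeC"   # placeholder string with two 'exe' occurrences

# str.replace with count=1 removes only the FIRST occurrence
first_removed = s.replace("exe", "", 1)
print(first_removed)                     # "ABexeC"

# Removing the SECOND occurrence instead:
i = s.find("exe", s.find("exe") + 1)     # start searching past the first hit
second_removed = s[:i] + s[i + 3:]
print(second_removed)                    # "AexeBC"

# Same multiset of characters removed, different residual ordering:
print(first_removed == second_removed)   # False
```

So any decrypt attempt that strips a filename out of a dataset has to commit to WHICH occurrence it strips, or try all of them.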

1 Like