
Bad password when going into caves



Hi,

I'm trying to set up a server with caves using Docker. Right now I'm running two containers, one for the overworld and one for the caves. The two containers are linked, but apparently not correctly: when I try to enter a cave I get a "bad password" error and get disconnected.

My docker-compose.yml:


dst:
  image: jamesits/don-t-starve-together-dedicated-server:latest  
  ports:
    - 10999:10999/udp
    - 10888
  restart: always
  volumes:
    - ./dst/master/server_config/:/data/DoNotStarveTogether
dst-slave:
  image: jamesits/don-t-starve-together-dedicated-server:latest
  ports:
    - 10998:10998/udp
  restart: always
  links:
    - dst:dstmaster
  volumes:
    - ./dst/slave/server_config/:/data/DoNotStarveTogether

Docker logs when I try to enter a cave:


dst_1       | Activating portal[10] to 2148122485    
dst_1       | Serializing user: session/FA82730C55382A03/KU_pO6nTfE0_/0000000007
dst_1       | [Shard] Migration request: (KU_pO6nTfE0) to Caves(2148122485)
dst_1       | [Shard] Begin migration #1 for (KU_pO6nTfE0)
dst_1       | [Shard] #1 [SHDMASTER](1) -> Caves(2148122485)
dst_1       | [Shard] #1 <- session/FA82730C55382A03/KU_pO6nTfE0_/0000000007
dst-slave_1 | [Shard] #1 -> session/118366ABB8FFADF3/KU_pO6nTfE0_/0000000007
dst-slave_1 | [Shard] Received migration #1 data for (KU_pO6nTfE0)
dst_1       | CloseConnectionWithReason: ID_DST_SHARD_SILENT_DISCONNECT
dst_1       | [Steam] SendUserDisconnect for '76561198025473082'
dst_1       | [Shard] (KU_pO6nTfE0) disconnected from [SHDMASTER](1)

dst_1       | [Shard] Migration for KU_pO6nTfE0 timed out.
dst_1       | [Shard] Cancelling pending migration #1 for (KU_pO6nTfE0)
dst_1       | [Leave Announcement] [FR]Wapaca
dst-slave_1 | [Shard] Migration by user ('KU_pO6nTfE0') failed (174). 
dst-slave_1 | [Leave Announcement] [FR]Wapaca

 

cluster.ini for master


[GAMEPLAY]
game_mode = survival
max_players = 5
pvp = false
pause_when_empty = true

[NETWORK]
server_port = 10999
lan_only_cluster = false
offline_cluster = false
enable_snapshots = true
enable_autosaver = false
cluster_description = -=|Social-Survival|=- || (Caves linked but bug)
cluster_name = [BETA][FR/EN] Wapaca's victims
server_intention = social
cluster_password = 
cluster_intention = social

[MISC]
max_snapshots = 5
console_enabled = false

[SHARD]
shard_enabled = true
is_master = true
master_ip = 127.0.0.1
master_port = 10888
bind_ip = 0.0.0.0
cluster_key = mypass

 


cluster.ini for slave


[GAMEPLAY]
game_mode = survival
max_players = 5
pvp = false
pause_when_empty = true


[NETWORK]
server_port = 10998
lan_only_cluster = false
offline_cluster = false
enable_snapshots = false
enable_autosaver = true
cluster_description = caves
cluster_name = [BETA][FR/EN] Wapaca's victims
server_intention = social
cluster_password = 
cluster_intention = social


[MISC]
max_snapshots = 0
console_enabled = false

[SHARD]
shard_enabled = true
name = Caves
is_master = false
bind_ip = 0.0.0.0
master_ip = dstmaster
master_port = 10888
cluster_key = mypass
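
As a side note, the two invariants these files are supposed to satisfy (a distinct `server_port` per shard, an identical `cluster_key` across shards) can be checked mechanically. A minimal Python sketch, with the values inlined from the two cluster.ini files above (in practice you would read the real files instead of strings):

```python
# Sanity-check shard settings: shards must listen on distinct ports
# and share the same cluster_key. Values copied from the configs above.
import configparser

MASTER_INI = """\
[NETWORK]
server_port = 10999
[SHARD]
cluster_key = mypass
"""

SLAVE_INI = """\
[NETWORK]
server_port = 10998
[SHARD]
cluster_key = mypass
"""

def shard_info(ini_text):
    cfg = configparser.ConfigParser()
    cfg.read_string(ini_text)
    return cfg["NETWORK"]["server_port"], cfg["SHARD"]["cluster_key"]

infos = [shard_info(text) for text in (MASTER_INI, SLAVE_INI)]
ports = [port for port, _ in infos]
assert len(set(ports)) == len(ports), "shards must use distinct server_port"
assert len({key for _, key in infos}) == 1, "cluster_key must match"
print("ports:", ports)
```

Note that this static check passes on my files, which is exactly why the bug was hard to spot: the problem turned out to be at runtime, not in the files (see below).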

 

You can also check the two screenshots below to see what happens from the player's side.

Do you have any idea how to fix this error? I can provide more logs/config files if you need.

Thanks for your help; I'll keep you posted if I find the solution.

20160314131339_1.jpg

20160314131357_1.jpg

Hey,

Actually, I figured out the issue today: the server port is set in the launch script, and since command-line settings take precedence over file settings, my `server_port` values were overridden and every server was running on port 10999. This went unnoticed because there was no port conflict inside each container.

The solution is to specify the port through an environment variable; see my new docker-compose.yml below. (This time I'm running three clusters, each with caves, on the same host; search for "wapaca" in the lobby to find them.)


dst-social:
  image: jamesits/don-t-starve-together-dedicated-server:latest  
  ports:
    - 10999:10999/udp
    - 10888
  restart: always
  environment:
    - DST_PORT=10999
  volumes:
    - ./dst/social/master/:/data/DoNotStarveTogether
dst-social-slave:
  image: jamesits/don-t-starve-together-dedicated-server:latest
  ports:
    - 10998:10998/udp
  restart: always
  environment:
    - DST_PORT=10998
  volumes:
    - ./dst/social/slave/:/data/DoNotStarveTogether
  links:
    - dst-social:masterip
dst-coop:
  image: jamesits/don-t-starve-together-dedicated-server:latest
  ports:
    - 10991:10991/udp
    - 10887
  restart: always
  environment:
    - DST_PORT=10991
  volumes:
    - ./dst/coop/master/:/data/DoNotStarveTogether
dst-coop-slave:
  image: jamesits/don-t-starve-together-dedicated-server:latest
  ports:
    - 10990:10990/udp
  restart: always
  environment:
    - DST_PORT=10990
  volumes:
    - ./dst/coop/slave/:/data/DoNotStarveTogether
  links:
    - dst-coop:masterip
dst-private:
  image: jamesits/don-t-starve-together-dedicated-server:latest
  ports:
    - 10986:10986/udp
    - 10886
  restart: always
  environment:
    - DST_PORT=10986
  volumes:
    - ./dst/private/master/:/data/DoNotStarveTogether
dst-private-slave:
  image: jamesits/don-t-starve-together-dedicated-server:latest
  ports:
    - 10985:10985/udp
  restart: always
  environment:
    - DST_PORT=10985
  volumes:
    - ./dst/private/slave/:/data/DoNotStarveTogether
  links:
    - dst-private:masterip
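
For each container, `DST_PORT` has to agree both with the `server_port` in that shard's own cluster.ini and with the published UDP port. For example, for `dst-private-slave` the matching cluster.ini fragment would be (same layout as the slave config shown earlier):

```ini
[NETWORK]
; must match DST_PORT=10985 and the 10985:10985/udp port mapping
server_port = 10985
```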

This issue reveals a bigger problem with the DST lobby. While setting this up, I noticed that if I run two servers on port 10999 but bind only one of them to the host's 10999, the lobby doesn't verify that a registered listing actually leads to the right server: a user who tries to connect to the misconfigured server ends up on whichever server is bound to 10999 at that IP. You could use this to trick the lobby and register an unlimited number of listings that all point to a single server. The worst part is that when I tried to connect to my coop server, it connected me to the social one without complaint, so you could really mess up the lobby by registering your server under every game mode and tag.

Archived

This topic is now archived and is closed to further replies.

Please be aware that the content of this thread may be outdated and no longer applicable.
