[GH-ISSUE #3691] Windows 11 frpc: sending a message always reports "server disconnected, reconnecting in 5 seconds" #2940

Closed
opened 2026-05-05 13:53:52 -06:00 by gitea-mirror · 3 comments
Owner

Originally created by @wintli on GitHub (Oct 17, 2023).
Original GitHub issue: https://github.com/fatedier/frp/issues/3691

Bug Description

The server side runs frps with a custom TCP listener; frpc is installed on the clients. The same client sends the same command to different servers: the Windows 10 server returns the reply correctly, but the Windows 11 server never receives it (a breakpoint set in the service is never hit). Accessing the service directly over the LAN, without frp, works fine. When going through frp, as soon as a connection is made it automatically receives a fixed message, "RFB 005.000" (not a message I defined), along with the prompt: The server is disconnected and a reconnection is scheduled in 5 seconds. After 5 seconds the reconnect again returns "RFB 005.000", and the command I want to send never reaches my target service on the Windows 11 machine; it looks as if it is being intercepted. Has anyone else run into a similar problem?

frpc Version

0.51.2/0.51.3

frps Version

0.51.3

System Architecture

linux/amd64

Configurations

frps.ini:
[common]
bind_addr = 0.0.0.0
bind_port = 10000
enable_prometheus = true
detailed_errors_to_client = true
authentication_method = token
authenticate_heartbeats = false
authenticate_new_work_conns = false
token = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
allow_ports = 10001-30000
max_pool_count = 5
max_ports_per_client = 0

frpc.ini:
[common]
server_addr = xxx.xxx.xxx.xxx
server_port = 10000
token = xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

[app_10011]
type = tcp
local_ip = 127.0.0.1
local_port = 10011
remote_port = 10011

[vnc_tcp_10012]
type = tcp
local_ip = 127.0.0.1
local_port = 5900
remote_port = 10012

[vnc_udp_10012]
type = udp
local_ip = 127.0.0.1
local_port = 5900
remote_port = 10012
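
The "RFB 005.000" greeting reported above is an RFB (VNC) protocol version handshake, which suggests the forwarded port is reaching a VNC listener rather than the custom service. A quick way to check is a banner probe: connect to the remote port and read whatever the server volunteers first. A minimal, self-contained sketch (the local stub server playing the VNC role is purely illustrative; against a real deployment you would call `read_banner(server_addr, remote_port)`):

```python
# Hedged diagnostic sketch (not part of frp): probe what a forwarded TCP port
# answers with. A VNC server greets immediately with an RFB version string,
# while a service that waits for a command sends nothing.
import socket
import threading

def read_banner(host: str, port: int, timeout: float = 2.0) -> bytes:
    """Connect and return whatever the server sends first (b"" if silent)."""
    with socket.create_connection((host, port), timeout=timeout) as conn:
        conn.settimeout(timeout)
        try:
            return conn.recv(64)
        except socket.timeout:
            return b""

# Local stub that greets like a VNC server, so the sketch runs anywhere.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
stub_port = srv.getsockname()[1]

def greeter():
    conn, _ = srv.accept()
    conn.sendall(b"RFB 003.008\n")
    conn.close()

threading.Thread(target=greeter, daemon=True).start()
banner = read_banner("127.0.0.1", stub_port)
print(banner)  # an RFB greeting means the port leads to VNC, not your app
```

If the probe returns an RFB banner on the port your custom service should own, the remote_port mapping for that proxy entry is pointing at the VNC side.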

Logs

2023/10/17 20:47:56 [I] [root.go:204] frps uses config file: /usr/local/frp/frps.ini
2023/10/17 20:47:56 [I] [service.go:206] frps tcp listen on 0.0.0.0:10000
2023/10/17 20:47:56 [I] [service.go:318] Dashboard listen on 0.0.0.0:9999
2023/10/17 20:47:56 [I] [root.go:213] frps started successfully
2023/10/17 20:47:56 [T] [service.go:461] start check TLS connection...
2023/10/17 20:47:56 [T] [service.go:470] check TLS connection success, isTLS: true custom: false
2023/10/17 20:47:56 [I] [service.go:539] [cdcc9054501e8e54] client login info: ip [113.65.252.76:56385] version [0.51.2] hostname [] os [windows] arch [amd64]
2023/10/17 20:47:56 [I] [tcp.go:81] [cdcc9054501e8e54] [app_10005] tcp proxy listen port [10005]
2023/10/17 20:47:56 [I] [control.go:497] [cdcc9054501e8e54] new proxy [app_10005] type [tcp] success
2023/10/17 20:47:56 [I] [tcp.go:81] [cdcc9054501e8e54] [vnc_tcp_10006] tcp proxy listen port [10006]
2023/10/17 20:47:56 [I] [control.go:497] [cdcc9054501e8e54] new proxy [vnc_tcp_10006] type [tcp] success
2023/10/17 20:47:56 [I] [udp.go:103] [cdcc9054501e8e54] [vnc_udp_10006] udp proxy listen port [10006]
2023/10/17 20:47:56 [I] [control.go:497] [cdcc9054501e8e54] new proxy [vnc_udp_10006] type [udp] success
2023/10/17 20:47:56 [D] [control.go:251] [cdcc9054501e8e54] new work connection registered
2023/10/17 20:47:57 [T] [service.go:461] start check TLS connection...
2023/10/17 20:47:57 [T] [service.go:470] check TLS connection success, isTLS: true custom: false
2023/10/17 20:47:57 [I] [service.go:539] [409648fbe2e53e48] client login info: ip [113.65.252.76:61857] version [0.51.3] hostname [] os [windows] arch [amd64]
2023/10/17 20:47:57 [D] [control.go:251] [409648fbe2e53e48] new work connection registered
2023/10/17 20:47:57 [I] [tcp.go:81] [409648fbe2e53e48] [app_10011] tcp proxy listen port [10011]
2023/10/17 20:47:57 [I] [control.go:497] [409648fbe2e53e48] new proxy [app_10011] type [tcp] success
2023/10/17 20:47:57 [D] [control.go:280] [cdcc9054501e8e54] get work connection from pool
2023/10/17 20:47:57 [D] [proxy.go:126] [cdcc9054501e8e54] [vnc_udp_10006] get a new work connection: [113.65.252.76:56385]
2023/10/17 20:47:57 [T] [udp.go:117] [cdcc9054501e8e54] [vnc_udp_10006] loop waiting message from udp workConn
2023/10/17 20:47:57 [D] [control.go:251] [cdcc9054501e8e54] new work connection registered
2023/10/17 20:48:01 [I] [proxy.go:199] [409648fbe2e53e48] [app_10011] get a user connection [113.65.252.76:55808]
2023/10/17 20:48:01 [D] [control.go:280] [409648fbe2e53e48] get work connection from pool
2023/10/17 20:48:01 [D] [proxy.go:126] [409648fbe2e53e48] [app_10011] get a new work connection: [113.65.252.76:61857]
2023/10/17 20:48:01 [T] [proxy.go:235] [409648fbe2e53e48] [app_10011] handler user tcp connection, use_encryption: false, use_compression: false
2023/10/17 20:48:01 [D] [proxy.go:255] [409648fbe2e53e48] [app_10011] join connections, workConn(l[172.22.131.127:10000] r[113.65.252.76:61857]) userConn(l[172.22.131.127:10011] r[113.65.252.76:55808])
2023/10/17 20:48:01 [D] [control.go:251] [409648fbe2e53e48] new work connection registered
2023/10/17 20:48:01 [I] [proxy.go:199] [409648fbe2e53e48] [app_10011] get a user connection [101.93.147.41:22025]
2023/10/17 20:48:01 [D] [control.go:280] [409648fbe2e53e48] get work connection from pool
2023/10/17 20:48:01 [D] [proxy.go:126] [409648fbe2e53e48] [app_10011] get a new work connection: [113.65.252.76:61857]
2023/10/17 20:48:01 [T] [proxy.go:235] [409648fbe2e53e48] [app_10011] handler user tcp connection, use_encryption: false, use_compression: false
2023/10/17 20:48:01 [D] [proxy.go:255] [409648fbe2e53e48] [app_10011] join connections, workConn(l[172.22.131.127:10000] r[113.65.252.76:61857]) userConn(l[172.22.131.127:10011] r[101.93.147.41:22025])
2023/10/17 20:48:01 [D] [control.go:251] [409648fbe2e53e48] new work connection registered
2023/10/17 20:48:26 [D] [control.go:532] [cdcc9054501e8e54] receive heartbeat
2023/10/17 20:48:27 [D] [control.go:532] [409648fbe2e53e48] receive heartbeat
2023/10/17 20:48:27 [T] [udp.go:135] [cdcc9054501e8e54] [vnc_udp_10006] udp work conn get ping message
2023/10/17 20:48:27 [T] [udp.go:117] [cdcc9054501e8e54] [vnc_udp_10006] loop waiting message from udp workConn
2023/10/17 20:48:50 [D] [proxy.go:265] [409648fbe2e53e48] [app_10011] join connections closed
2023/10/17 20:48:51 [I] [proxy.go:199] [409648fbe2e53e48] [app_10011] get a user connection [101.93.147.41:22043]
2023/10/17 20:48:51 [D] [control.go:280] [409648fbe2e53e48] get work connection from pool
2023/10/17 20:48:51 [D] [proxy.go:126] [409648fbe2e53e48] [app_10011] get a new work connection: [113.65.252.76:61857]
2023/10/17 20:48:51 [T] [proxy.go:235] [409648fbe2e53e48] [app_10011] handler user tcp connection, use_encryption: false, use_compression: false
2023/10/17 20:48:51 [D] [proxy.go:255] [409648fbe2e53e48] [app_10011] join connections, workConn(l[172.22.131.127:10000] r[113.65.252.76:61857]) userConn(l[172.22.131.127:10011] r[101.93.147.41:22043])
2023/10/17 20:48:51 [D] [control.go:251] [409648fbe2e53e48] new work connection registered
2023/10/17 20:48:56 [D] [control.go:532] [cdcc9054501e8e54] receive heartbeat
2023/10/17 20:48:57 [D] [control.go:532] [409648fbe2e53e48] receive heartbeat
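
The "join connections" lines in the log mark the point where frps splices a user connection to a pooled work connection and relays bytes in both directions. A minimal Python sketch of that splice, using local socket pairs in place of real network connections (frp itself is written in Go; the `pump`/`join` names here are illustrative, not frp's actual code):

```python
# Hedged sketch of what "join connections" amounts to: relay bytes both ways
# between a user connection and a work connection until either side hits EOF.
import socket
import threading

def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until EOF, then half-close the destination."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    try:
        dst.shutdown(socket.SHUT_WR)
    except OSError:
        pass

def join(a: socket.socket, b: socket.socket) -> None:
    """Splice two connections: traffic on one is relayed to the other."""
    t = threading.Thread(target=pump, args=(b, a), daemon=True)
    t.start()
    pump(a, b)

# Demo with local socket pairs standing in for userConn and workConn.
u_app, u_relay = socket.socketpair()
w_relay, w_client = socket.socketpair()
threading.Thread(target=join, args=(u_relay, w_relay), daemon=True).start()
u_app.sendall(b"ping")
print(w_client.recv(4))  # relayed from the user side to the work side
```

Because the relay is content-agnostic, frps forwards whatever the local service sends, including an unexpected RFB greeting: the splice itself cannot tell a VNC server from the intended application.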

Steps to reproduce

...

Affected area

  • [ ] Docs
  • [ ] Installation
  • [ ] Performance and Scalability
  • [ ] Security
  • [ ] User Experience
  • [x] Test and Release
  • [ ] Developer Infrastructure
  • [ ] Client Plugin
  • [ ] Server Plugin
  • [ ] Extensions
  • [ ] Others
gitea-mirror 2026-05-05 13:53:52 -06:00
Author
Owner

@wintli commented on GitHub (Oct 17, 2023):

Ha, one of the config entries was wrong.

Author
Owner

@wensenz commented on GitHub (Oct 25, 2023):

So where exactly was the problem?

Author
Owner

@github-actions[bot] commented on GitHub (Nov 25, 2023):

Issues go stale after 30d of inactivity. Stale issues rot after an additional 7d of inactivity and eventually close.
