[GH-ISSUE #3470] port already used and router config conflict #2774

Closed
opened 2026-05-05 13:47:18 -06:00 by gitea-mirror · 8 comments

Originally created by @amxliuli on GitHub (Jun 1, 2023).
Original GitHub issue: https://github.com/fatedier/frp/issues/3470

Bug Description

After starting the frp client and server, the tunnels work normally, but after some time in use (sometimes a few days, sometimes only a few minutes) tunneling fails. The logs show that the client keeps logging in to the server repeatedly (every few minutes), and then at some point one of these logins suddenly reports a port or router conflict.

frpc Version

0.38.0

frps Version

0.38.0

System Architecture

linux/amd64

Configurations

  1. frps.ini
[common]
bind_port = 7000
tls_enable = true

#http https
vhost_http_port = 86
#vhost_https_port = 443
vhost_http_timeout = 6000

#log
log_file = /usr/local/software/frp/log/frps.log
log_level = trace

heartbeat_timeout = 300
user_conn_timeout = 60

#subdomain
subdomain_host = frp.***.com

max_pool_count = 200
  2. frpc.ini
[common]
server_addr = 182.61.53.***
server_port = 7000

# console or real logFile path like ./frpc.log
log_file = /usr/local/soft/frp/log/frpc.log

# trace, debug, info, warn, error
log_level = trace

log_max_days = 3

tls_enable=true
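
The proxy sections of this frpc.ini are not shown above. As a minimal sketch, assuming typical entries of the kind the logs refer to (the names and ports below are hypothetical, not the reporter's), each TCP proxy claims a remote_port and each HTTP proxy claims a vhost route on frps; these are the resources that the "port already used" and "router config conflict" checks guard.

# hypothetical examples, not taken from the reporter's configuration
[example_ssh_tcp]
type = tcp
local_ip = 127.0.0.1
local_port = 22
# a second proxy claiming this remote port is rejected with "port already used"
remote_port = 6022

[example_web_http]
type = http
local_port = 8080
# a second proxy claiming this subdomain is rejected with "router config conflict"
subdomain = example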

Logs

  1. frps.log
2023/06/01 14:32:18 [W] [control.go:312] [da8fd4774c1fc492] write message to control connection error: session shutdown
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [control.go:335] [da8fd4774c1fc492] control connection closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:18 [D] [proxy.go:300] [da8fd4774c1fc492] [minio_tcp] join connections closed
2023/06/01 14:32:19 [T] [service.go:396] start check TLS connection...
2023/06/01 14:32:19 [T] [service.go:404] success check TLS connection
2023/06/01 14:32:19 [I] [service.go:449] [3cc6c08b2114ce53] client login info: ip [219.145.34.148:1819] version [0.38.0] hostname [] os [linux] arch [amd64]
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [xndc_guoke_test_yanshi_front] error: port already used
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [98_registry_tcp] error: port already used
2023/06/01 14:32:19 [I] [proxy.go:88] [3cc6c08b2114ce53] [99_simulation_http] proxy closing
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [99_simulation_http] error: router config conflict
2023/06/01 14:32:19 [W] [control.go:440] [3cc6c08b2114ce53] new proxy [suanfa_ssh_tcp] error: port already used
  2. frpc.log
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [98_mysql] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [98_huaneng] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [minio_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [xndc_guoke_tcp_dev_front_ok] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [95_mysql8] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_harbor_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_ssh] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [zgh_zhny_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [sdprice_forecast_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [xndc_guoke_tcp_dev_back] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [97_ssh_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [96_mysql8] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_keking] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [96_ssh_tcp] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [minio_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_maven_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_gitlab] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [xndc_guoke_tcp_yanshi_ok] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [99_harbor] change status from [new] to [wait start]
2023/06/01 14:32:19 [T] [proxy_wrapper.go:171] [3cc6c08b2114ce53] [mantis_http] change status from [new] to [wait start]
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [xndc_guoke_test_yanshi_front] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [98_registry_tcp] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [99_simulation_http] start error: router config conflict
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [suanfa_ssh_tcp] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [zj_prod_http] start error: router config conflict
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [96_mysql5] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [fp_ocr_tcp] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [97_price] start error: router config conflict
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [98_ssh] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [xndc_guoke_tcp_dev_front] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [xndc_guoke_tcp_yanshi_back] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [99_mysql] start error: port already used
2023/06/01 14:32:19 [W] [control.go:178] [3cc6c08b2114ce53] [sdprice_forecast_backup] start error: port already used

Steps to reproduce

  1. Start the server
  2. Start the client
  3. After some time in use, tunneling fails
    ...

Affected area

  • Docs
  • Installation
  • Performance and Scalability
  • Security
  • User Experience
  • Test and Release
  • Developer Infrastructure
  • Client Plugin
  • Server Plugin
  • Extensions
  • Others

@Becods commented on GitHub (Jun 1, 2023):

Upgrade your frp.
Run a continuous ping test with mtr.


@amxliuli commented on GitHub (Jun 2, 2023):

Since the tunnels are currently used in production, I can't upgrade the version right away. I only ran the mtr tests; the results are as follows:

  1. c->s
    screenshot: https://github.com/fatedier/frp/assets/25027057/5f3b4afe-2f0a-4f00-bc3f-283b48ea26c6
  2. s->c
    screenshot: https://github.com/fatedier/frp/assets/25027057/3a70e568-d0f3-4997-acee-e09e1fd042f8

@amxliuli commented on GitHub (Jun 2, 2023):

The error on today's disconnect was this one:
screenshot: https://github.com/fatedier/frp/assets/25027057/549a34b4-af9f-466b-8f62-34f4c49417a5


@fatedier commented on GitHub (Jun 2, 2023):

Disconnects are mostly caused by network problems. Normally, when the network recovers and a client with the same RunID reconnects, the previously held resources are released, so "port already used" should not occur.

If your frpc was restarted, the previous RunID is lost and it counts as a brand-new client. frps only releases the related resources after the heartbeat with the previous client times out. You can shorten this wait by adjusting the heartbeat-related configuration.

If frpc was not restarted but the RunID still changed, that may be a bug. Otherwise, just wait for the automatic reconnection once the network recovers.
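
A minimal sketch of the heartbeat-related settings involved, with hypothetical values (the reporter's frps currently uses heartbeat_timeout = 300; lowering it shortens how long frps keeps the dead client's ports and routes reserved):

# frps.ini, hypothetical values for illustration
[common]
# how long frps waits without a heartbeat before it treats the old client
# as gone and frees its proxy ports / vhost routes
heartbeat_timeout = 90

# frpc.ini, hypothetical values for illustration
[common]
# send heartbeats well within the server-side timeout
heartbeat_interval = 30
heartbeat_timeout = 90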


@amxliuli commented on GitHub (Jun 2, 2023):

heartbeat_timeout = 300
user_conn_timeout = 60
So should I set these two values lower? When the network has a problem and the client reconnects, if these timeouts are set too long, the old connection has not yet been released on the server side, so the new connection finds the ports still held and reports this error.


@yzlnew commented on GitHub (Jun 26, 2023):

I ran into a similar problem on 0.44.0; both the server and the client had to be restarted.


@github-actions[bot] commented on GitHub (Jul 27, 2023):

Issues go stale after 30d of inactivity. Stale issues rot after an additional 7d of inactivity and eventually close.


@swzaaaaaaa commented on GitHub (Jan 25, 2024):

Is there a solution for this?
