Mirror of https://github.com/fatedier/frp.git (synced 2026-05-15 08:05:49 -06:00)
[GH-ISSUE #3781] Can a range of ports be proxied through a single port, e.g. ports 40000-50000 reverse-proxied through port 6000? #3009
Originally created by @linshangqiang on GitHub (Nov 17, 2023).
Original GitHub issue: https://github.com/fatedier/frp/issues/3781
Describe the feature request
Thanks for open-sourcing this project.
I have two questions:
1. Is it possible to map a range of ports out through a single port, e.g. UDP ports 40000-50000 mapped to port 6000?
2. Can HTTP and UDP share the same mapped port?
What I want to achieve: all needed ports, whether HTTP or a range of UDP ports, go through a single proxy port, so the server only needs to open one port. Alternatively, HTTP goes through one proxy port while the UDP port range shares another, so the server only needs to expose two ports.
My configuration is as follows:
I started a service listening on UDP port 40001 on 10.10.0.24.
From another device, I sent: echo "Hello from UDP client" | nc -u 182.254.156.93 6000
10.10.0.24 received no data.
If I change local_port = 40000-50000 to local_port = 40000, have the UDP server listen on port 40000, and send echo "Hello from UDP client" | nc -u 182.254.156.93 6000 from the other device, the data is received successfully.
How should I configure a range of ports to go through a single proxy port? If that is not possible, how should I configure the range-ports-to-range-proxy-ports scheme?
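For reference, the range-to-range fallback the question describes matches frp's legacy INI `range:` section prefix. A minimal sketch (section name, server address, and port values are illustrative, not taken from the issue):

```ini
; frpc.ini -- range-to-range UDP mapping (sketch; values are placeholders)
[common]
server_addr = x.x.x.x
server_port = 7000

; The "range:" prefix expands this section into one UDP proxy per port,
; mapping local 40000 -> remote 40000 ... local 50000 -> remote 50000.
[range:range_udp]
type = udp
local_ip = 10.10.0.24
local_port = 40000-50000
remote_port = 40000-50000
```

Note that this opens one remote port per local port; as the thread goes on to conclude, it does not multiplex the whole range behind a single remote port.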
Describe alternatives you've considered
No response
Affected area
@linshangqiang commented on GitHub (Nov 17, 2023):
The UDP range-ports + range-mapping setup is working now; my configuration is as follows:
This approach requires opening quite a few ports, though. Does frp support proxying n UDP ports out through a single port?
@xqzr commented on GitHub (Nov 17, 2023):
526e809bd5/conf/frpc_full_example.toml (L254-L261)
It may not support UDP.
@linshangqiang commented on GitHub (Nov 19, 2023):
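The section of frpc_full_example.toml referenced above demonstrates frp's Go-template range expansion in the TOML format. A sketch of that pattern (reproduced from memory; the port ranges and proxy name are illustrative):

```toml
# frpc.toml -- template-expanded range mapping (sketch; values are placeholders)
{{- range $_, $v := parseNumberRangePair "6000-6006,6007" "6000-6006,6007" }}
[[proxies]]
name = "tcp-{{ $v.First }}"
type = "tcp"
localPort = {{ $v.First }}
remotePort = {{ $v.Second }}
{{- end }}
```

As the comment notes, the example in the repository uses type = "tcp"; whether the same template works for UDP proxies is what is in question here.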
@xqzr commented on GitHub (Nov 19, 2023):
https://gofrp.org/zh-cn/docs/features/common/client-plugin/
@superzjg commented on GitHub (Nov 21, 2023):
https://github.com/fatedier/frp/issues/3711
@github-actions[bot] commented on GitHub (Dec 22, 2023):
Issues go stale after 30d of inactivity. Stale issues rot after an additional 7d of inactivity and eventually close.