OCPBUGS-80939: Add robots.txt policy to console#16205
logonoff wants to merge 1 commit into openshift:main
Conversation
@logonoff: This pull request references Jira Issue OCPBUGS-80939, which is invalid.

The bug has been updated to refer to the pull request using the external bug tracker.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the openshift-eng/jira-lifecycle-plugin repository.
@logonoff: This pull request references Jira Issue OCPBUGS-80939, which is valid. The bug has been moved to the POST state. 3 validation(s) were run on this bug.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the openshift-eng/jira-lifecycle-plugin repository.
📝 Walkthrough: This change introduces web crawler directives by adding a robots.txt file to the console.
Actionable comments posted: 2
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@frontend/public/robots.txt`:
- Around line 1-2: The robots.txt stanza has a malformed User-agent directive
("User-Agent *" missing the colon) so crawlers may ignore the stanza; update the
directive to the correct syntax by changing the first line to "User-agent: *"
(ensure exact capitalization/spelling used in your repository's convention) so
it pairs with the existing "Disallow: /" line and correctly blocks all crawlers.
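Putting the two directives from the comment together, the corrected stanza would read:

```
User-agent: *
Disallow: /
```

With the colon in place, compliant crawlers recognize the record and are blocked from the entire site.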
In `@pkg/server/server.go`:
- Around line 338-340: The route for robots.txt is currently registered via
handleFunc which prefixes it with s.BaseURL.Path and therefore only serves
${basePath}/robots.txt; instead register an origin-root route for "/robots.txt"
(in addition to the existing base-path one) so crawlers can fetch "/robots.txt".
Update the registration so that the root path "/robots.txt" is served directly
(e.g., use http.HandleFunc or the router instance that is not wrapped with
s.BaseURL.Path) and call http.ServeFile(w, r, path.Join(s.PublicDir,
"robots.txt")) in that handler; reference the existing handleFunc,
s.BaseURL.Path, s.PublicDir and robots.txt when making the change.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Repository YAML (base), Organization UI (inherited)
Review profile: CHILL
Plan: Pro
Run ID: abbcfbe3-69a6-4fd5-a7b0-f767741cea44
📒 Files selected for processing (3)
- frontend/public/robots.txt
- frontend/webpack.config.ts
- pkg/server/server.go
📜 Review details
🧰 Additional context used
📓 Path-based instructions (1)
⚙️ CodeRabbit configuration file
- Focus on major issues impacting performance, readability, maintainability and security. Avoid nitpicks and avoid verbosity.
Files:
- pkg/server/server.go
- frontend/public/robots.txt
- frontend/webpack.config.ts
🔇 Additional comments (1)
frontend/webpack.config.ts (1)
293-300: Good cleanup of the locale copy rules. This keeps the /static/locales/... layout intact and filters out non-JSON files, so OWNERS files stop leaking into the bundle.
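The glob-based copy praised here could look roughly like the following sketch; the plugin options shown (`context`, `to`) and the exact paths are assumptions for illustration, not the actual contents of frontend/webpack.config.ts:

```typescript
import CopyWebpackPlugin from 'copy-webpack-plugin';

// Hypothetical pattern: copy only locale JSON files, preserving the
// locales/<lang>/<namespace>.json layout under /static, so non-JSON
// files such as OWNERS are never emitted into the bundle.
const copyLocales = new CopyWebpackPlugin({
  patterns: [
    {
      from: 'locales/**/*.json', // glob filters out OWNERS and other non-JSON files
      context: 'public',         // assumed source root
      to: 'static',              // emitted as static/locales/<lang>/<ns>.json
    },
  ],
});
```

A single glob pattern like this replaces per-directory copy rules, which is what prevents the OWNERS files from being bundled.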
Force-pushed from 2505928 to e2d0980 (Compare)
also simplify locales copy/paste into a glob (which also prevents the locales OWNERS files from being bundled)
Force-pushed from e2d0980 to 930da3d (Compare)
[APPROVAL NOTIFIER] This PR is APPROVED

This pull-request has been approved by: jhadvig, logonoff

The full list of commands accepted by this bot can be found here. The pull request process is described here.

Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment.
/assign @yapei

No functional changes; I did a very simple verification in local testing and believe it should work fine after the PR is merged.

/verified by @yapei
@yapei: This PR has been marked as verified by @yapei.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the openshift-eng/jira-lifecycle-plugin repository.
/test all

1 similar comment

/test all
@logonoff: The following test failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

Full PR test history. Your PR dashboard.

Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes-sigs/prow repository. I understand the commands that are listed here.