// Copyright 2019 The Gitea Authors. All rights reserved.
// Copyright 2018 Jonas Franz. All rights reserved.
// Use of this source code is governed by a MIT-style
// license that can be found in the LICENSE file.

package migrations

import (
	"context"
	"fmt"
	"io"
	"os"
	"path"
	"path/filepath"
"strconv"
2019-05-07 03:12:51 +02:00
"strings"
"time"
"code.gitea.io/gitea/models"
2021-09-19 13:49:59 +02:00
"code.gitea.io/gitea/models/db"
"code.gitea.io/gitea/models/foreignreference"
2022-03-31 11:20:39 +02:00
issues_model "code.gitea.io/gitea/models/issues"
2021-11-19 14:39:57 +01:00
repo_model "code.gitea.io/gitea/models/repo"
2021-11-24 10:49:20 +01:00
user_model "code.gitea.io/gitea/models/user"
2019-05-07 03:12:51 +02:00
"code.gitea.io/gitea/modules/git"
"code.gitea.io/gitea/modules/log"
2021-11-16 16:25:33 +01:00
base "code.gitea.io/gitea/modules/migration"
2020-01-12 13:11:17 +01:00
repo_module "code.gitea.io/gitea/modules/repository"
2019-05-07 03:12:51 +02:00
"code.gitea.io/gitea/modules/setting"
2020-08-18 06:23:45 +02:00
"code.gitea.io/gitea/modules/storage"
2019-10-13 15:23:14 +02:00
"code.gitea.io/gitea/modules/structs"
2019-08-15 16:46:21 +02:00
"code.gitea.io/gitea/modules/timeutil"
2020-12-27 04:34:19 +01:00
"code.gitea.io/gitea/modules/uri"
2020-10-27 22:34:56 +01:00
"code.gitea.io/gitea/services/pull"
2019-05-07 03:12:51 +02:00
2022-09-04 15:41:21 +02:00
"github.com/google/uuid"
2019-05-07 03:12:51 +02:00
)

var _ base.Uploader = &GiteaLocalUploader{}

// GiteaLocalUploader implements an Uploader to gitea sites
type GiteaLocalUploader struct {
	ctx            context.Context
	doer           *user_model.User
	repoOwner      string
	repoName       string
	repo           *repo_model.Repository
	labels         map[string]*issues_model.Label
	milestones     map[string]int64
	issues         map[int64]*issues_model.Issue
	gitRepo        *git.Repository
	prHeadCache    map[string]string
	sameApp        bool
	userMap        map[int64]int64 // external user id mapping to user id
	prCache        map[int64]*issues_model.PullRequest
	gitServiceType structs.GitServiceType
}

// NewGiteaLocalUploader creates a gitea Uploader via gitea API v1
func NewGiteaLocalUploader(ctx context.Context, doer *user_model.User, repoOwner, repoName string) *GiteaLocalUploader {
	return &GiteaLocalUploader{
		ctx:         ctx,
		doer:        doer,
		repoOwner:   repoOwner,
		repoName:    repoName,
		labels:      make(map[string]*issues_model.Label),
		milestones:  make(map[string]int64),
		issues:      make(map[int64]*issues_model.Issue),
		prHeadCache: make(map[string]string),
		userMap:     make(map[int64]int64),
		prCache:     make(map[int64]*issues_model.PullRequest),
	}
}
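
// The sketch below is an editorial addition, not part of the upstream file: a
// minimal illustration of how this uploader is typically driven by the
// migration pipeline. `downloadedRepo`, `opts`, `ctx` and `doer` are assumed
// placeholders; the Create* methods, Finish and Rollback are the ones defined
// in this file.
//
//	uploader := NewGiteaLocalUploader(ctx, doer, "owner", "repo")
//	defer uploader.Close()
//	if err := uploader.CreateRepo(downloadedRepo, opts); err != nil {
//		_ = uploader.Rollback()
//		return err
//	}
//	// ... CreateTopics, CreateMilestones, CreateLabels, CreateReleases,
//	// CreateIssues, CreateComments, CreatePullRequests, CreateReviews ...
//	if err := uploader.Finish(); err != nil {
//		_ = uploader.Rollback()
//		return err
//	}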

// MaxBatchInsertSize returns the table's max batch insert size
func (g *GiteaLocalUploader) MaxBatchInsertSize(tp string) int {
	switch tp {
	case "issue":
		return db.MaxBatchInsertSize(new(issues_model.Issue))
	case "comment":
		return db.MaxBatchInsertSize(new(issues_model.Comment))
	case "milestone":
		return db.MaxBatchInsertSize(new(issues_model.Milestone))
	case "label":
		return db.MaxBatchInsertSize(new(issues_model.Label))
	case "release":
		return db.MaxBatchInsertSize(new(models.Release))
	case "pullrequest":
		return db.MaxBatchInsertSize(new(issues_model.PullRequest))
	}
	return 10
}
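
// Editorial sketch (assumption, not upstream code): MaxBatchInsertSize lets a
// caller chunk bulk inserts so a single statement never exceeds the database
// limit. A hypothetical caller, where `iss` is a slice of issues to insert:
//
//	batch := g.MaxBatchInsertSize("issue")
//	for len(iss) > 0 {
//		n := batch
//		if n > len(iss) {
//			n = len(iss)
//		}
//		// insert iss[:n] in one statement, then advance
//		iss = iss[n:]
//	}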

// CreateRepo creates a repository
func (g *GiteaLocalUploader) CreateRepo(repo *base.Repository, opts base.MigrateOptions) error {
	owner, err := user_model.GetUserByName(g.ctx, g.repoOwner)
	if err != nil {
		return err
	}

	var r *repo_model.Repository
	if opts.MigrateToRepoID <= 0 {
		r, err = repo_module.CreateRepository(g.doer, owner, models.CreateRepoOptions{
			Name:           g.repoName,
			Description:    repo.Description,
			OriginalURL:    repo.OriginalURL,
			GitServiceType: opts.GitServiceType,
			IsPrivate:      opts.Private,
			IsMirror:       opts.Mirror,
			Status:         repo_model.RepositoryBeingMigrated,
		})
	} else {
		r, err = repo_model.GetRepositoryByID(opts.MigrateToRepoID)
	}
	if err != nil {
		return err
	}
	r.DefaultBranch = repo.DefaultBranch
	r.Description = repo.Description

	r, err = repo_module.MigrateRepositoryGitData(g.ctx, owner, r, base.MigrateOptions{
		RepoName:       g.repoName,
		Description:    repo.Description,
		OriginalURL:    repo.OriginalURL,
		GitServiceType: opts.GitServiceType,
		Mirror:         repo.IsMirror,
		LFS:            opts.LFS,
		LFSEndpoint:    opts.LFSEndpoint,
		CloneAddr:      repo.CloneURL, // SECURITY: we will assume that this has already been checked
		Private:        repo.IsPrivate,
		Wiki:           opts.Wiki,
		Releases:       opts.Releases, // if didn't get releases, then sync them from tags
		MirrorInterval: opts.MirrorInterval,
	}, NewMigrationHTTPTransport())

	g.sameApp = strings.HasPrefix(repo.OriginalURL, setting.AppURL)
	g.repo = r
	if err != nil {
		return err
	}
	g.gitRepo, err = git.OpenRepository(g.ctx, r.RepoPath())
	return err
}

// Close closes this uploader
func (g *GiteaLocalUploader) Close() {
	if g.gitRepo != nil {
		g.gitRepo.Close()
	}
}

// CreateTopics creates topics
func (g *GiteaLocalUploader) CreateTopics(topics ...string) error {
	// Ignore topics too long for the db
	c := 0
	for _, topic := range topics {
		if len(topic) > 50 {
			continue
		}

		topics[c] = topic
		c++
	}
	topics = topics[:c]
	return repo_model.SaveTopics(g.repo.ID, topics...)
}

// CreateMilestones creates milestones
func (g *GiteaLocalUploader) CreateMilestones(milestones ...*base.Milestone) error {
	mss := make([]*issues_model.Milestone, 0, len(milestones))
	for _, milestone := range milestones {
		var deadline timeutil.TimeStamp
		if milestone.Deadline != nil {
			deadline = timeutil.TimeStamp(milestone.Deadline.Unix())
		}
		if deadline == 0 {
			deadline = timeutil.TimeStamp(time.Date(9999, 1, 1, 0, 0, 0, 0, setting.DefaultUILocation).Unix())
		}

		if milestone.Created.IsZero() {
			if milestone.Updated != nil {
				milestone.Created = *milestone.Updated
			} else if milestone.Deadline != nil {
				milestone.Created = *milestone.Deadline
			} else {
				milestone.Created = time.Now()
			}
		}
		if milestone.Updated == nil || milestone.Updated.IsZero() {
			milestone.Updated = &milestone.Created
		}

		ms := issues_model.Milestone{
			RepoID:       g.repo.ID,
			Name:         milestone.Title,
			Content:      milestone.Description,
			IsClosed:     milestone.State == "closed",
			CreatedUnix:  timeutil.TimeStamp(milestone.Created.Unix()),
			UpdatedUnix:  timeutil.TimeStamp(milestone.Updated.Unix()),
			DeadlineUnix: deadline,
		}
		if ms.IsClosed && milestone.Closed != nil {
			ms.ClosedDateUnix = timeutil.TimeStamp(milestone.Closed.Unix())
		}
		mss = append(mss, &ms)
	}

	err := models.InsertMilestones(mss...)
	if err != nil {
		return err
	}

	for _, ms := range mss {
		g.milestones[ms.Name] = ms.ID
	}
	return nil
}

// CreateLabels creates labels
func (g *GiteaLocalUploader) CreateLabels(labels ...*base.Label) error {
	lbs := make([]*issues_model.Label, 0, len(labels))
	for _, label := range labels {
		// We must validate color here:
		if !issues_model.LabelColorPattern.MatchString("#" + label.Color) {
			log.Warn("Invalid label color: #%s for label: %s in migration to %s/%s", label.Color, label.Name, g.repoOwner, g.repoName)
			label.Color = "ffffff"
		}

		lbs = append(lbs, &issues_model.Label{
			RepoID:      g.repo.ID,
			Name:        label.Name,
			Description: label.Description,
			Color:       "#" + label.Color,
		})
	}

	err := issues_model.NewLabels(lbs...)
	if err != nil {
		return err
	}
	for _, lb := range lbs {
		g.labels[lb.Name] = lb
	}
	return nil
}

// CreateReleases creates releases
func (g *GiteaLocalUploader) CreateReleases(releases ...*base.Release) error {
	rels := make([]*models.Release, 0, len(releases))
	for _, release := range releases {
		if release.Created.IsZero() {
			if !release.Published.IsZero() {
				release.Created = release.Published
			} else {
				release.Created = time.Now()
			}
		}

		// SECURITY: The TagName must be a valid git ref
		if release.TagName != "" && !git.IsValidRefPattern(release.TagName) {
			release.TagName = ""
		}

		// SECURITY: The TargetCommitish must be a valid git ref
		if release.TargetCommitish != "" && !git.IsValidRefPattern(release.TargetCommitish) {
			release.TargetCommitish = ""
		}

		rel := models.Release{
			RepoID:       g.repo.ID,
			TagName:      release.TagName,
			LowerTagName: strings.ToLower(release.TagName),
			Target:       release.TargetCommitish,
			Title:        release.Name,
			Note:         release.Body,
			IsDraft:      release.Draft,
			IsPrerelease: release.Prerelease,
			IsTag:        false,
			CreatedUnix:  timeutil.TimeStamp(release.Created.Unix()),
		}

		if err := g.remapUser(release, &rel); err != nil {
			return err
		}

		// calc NumCommits if possible
		if rel.TagName != "" {
			commit, err := g.gitRepo.GetTagCommit(rel.TagName)
			if !git.IsErrNotExist(err) {
				if err != nil {
					return fmt.Errorf("GetTagCommit[%v]: %v", rel.TagName, err)
				}
				rel.Sha1 = commit.ID.String()
				rel.NumCommits, err = commit.CommitsCount()
				if err != nil {
					return fmt.Errorf("CommitsCount: %v", err)
				}
			}
		}

		for _, asset := range release.Assets {
			if asset.Created.IsZero() {
				if !asset.Updated.IsZero() {
					asset.Created = asset.Updated
				} else {
					asset.Created = release.Created
				}
			}

			attach := repo_model.Attachment{
				UUID:          uuid.New().String(),
				Name:          asset.Name,
				DownloadCount: int64(*asset.DownloadCount),
				Size:          int64(*asset.Size),
				CreatedUnix:   timeutil.TimeStamp(asset.Created.Unix()),
			}

			// SECURITY: We cannot check the DownloadURL and DownloadFunc are safe here
			// ... we must assume that they are safe and simply download the attachment
			err := func() error {
				// asset.DownloadURL may be a local file
				var rc io.ReadCloser
				var err error
				if asset.DownloadFunc != nil {
					rc, err = asset.DownloadFunc()
					if err != nil {
						return err
					}
				} else if asset.DownloadURL != nil {
					rc, err = uri.Open(*asset.DownloadURL)
					if err != nil {
						return err
					}
				}
				if rc == nil {
					return nil
				}
				_, err = storage.Attachments.Save(attach.RelativePath(), rc, int64(*asset.Size))
				rc.Close()
				return err
			}()
			if err != nil {
				return err
			}

			rel.Attachments = append(rel.Attachments, &attach)
		}

		rels = append(rels, &rel)
	}

	return models.InsertReleases(rels...)
}

// SyncTags syncs releases with tags in the database
func (g *GiteaLocalUploader) SyncTags() error {
	return repo_module.SyncReleasesWithTags(g.repo, g.gitRepo)
}

// CreateIssues creates issues
func (g *GiteaLocalUploader) CreateIssues(issues ...*base.Issue) error {
	iss := make([]*issues_model.Issue, 0, len(issues))
	for _, issue := range issues {
		var labels []*issues_model.Label
		for _, label := range issue.Labels {
			lb, ok := g.labels[label.Name]
			if ok {
				labels = append(labels, lb)
			}
		}

		milestoneID := g.milestones[issue.Milestone]

		if issue.Created.IsZero() {
			if issue.Closed != nil {
				issue.Created = *issue.Closed
			} else {
				issue.Created = time.Now()
			}
		}
		if issue.Updated.IsZero() {
			if issue.Closed != nil {
				issue.Updated = *issue.Closed
			} else {
				issue.Updated = time.Now()
			}
		}

		// SECURITY: issue.Ref needs to be a valid reference
		if !git.IsValidRefPattern(issue.Ref) {
			log.Warn("Invalid issue.Ref[%s] in issue #%d in %s/%s", issue.Ref, issue.Number, g.repoOwner, g.repoName)
			issue.Ref = ""
		}

		is := issues_model.Issue{
			RepoID:      g.repo.ID,
			Repo:        g.repo,
			Index:       issue.Number,
			Title:       issue.Title,
			Content:     issue.Content,
			Ref:         issue.Ref,
			IsClosed:    issue.State == "closed",
			IsLocked:    issue.IsLocked,
			MilestoneID: milestoneID,
			Labels:      labels,
			CreatedUnix: timeutil.TimeStamp(issue.Created.Unix()),
			UpdatedUnix: timeutil.TimeStamp(issue.Updated.Unix()),
			ForeignReference: &foreignreference.ForeignReference{
				LocalIndex:   issue.GetLocalIndex(),
				ForeignIndex: strconv.FormatInt(issue.GetForeignIndex(), 10),
				RepoID:       g.repo.ID,
				Type:         foreignreference.TypeIssue,
			},
		}

		if err := g.remapUser(issue, &is); err != nil {
			return err
		}

		if issue.Closed != nil {
			is.ClosedUnix = timeutil.TimeStamp(issue.Closed.Unix())
		}

		// add reactions
		for _, reaction := range issue.Reactions {
			res := issues_model.Reaction{
				Type:        reaction.Content,
				CreatedUnix: timeutil.TimeStampNow(),
			}
			if err := g.remapUser(reaction, &res); err != nil {
				return err
			}
			is.Reactions = append(is.Reactions, &res)
		}

		iss = append(iss, &is)
	}

	if len(iss) > 0 {
		if err := models.InsertIssues(iss...); err != nil {
			return err
		}

		for _, is := range iss {
			g.issues[is.Index] = is
		}
	}

	return nil
}

// CreateComments creates comments of issues
func (g *GiteaLocalUploader) CreateComments(comments ...*base.Comment) error {
	cms := make([]*issues_model.Comment, 0, len(comments))
	for _, comment := range comments {
		var issue *issues_model.Issue
		issue, ok := g.issues[comment.IssueIndex]
		if !ok {
			return fmt.Errorf("comment references non existent IssueIndex %d", comment.IssueIndex)
		}

		if comment.Created.IsZero() {
			comment.Created = time.Unix(int64(issue.CreatedUnix), 0)
		}
		if comment.Updated.IsZero() {
			comment.Updated = comment.Created
		}

		cm := issues_model.Comment{
			IssueID:     issue.ID,
			Type:        issues_model.CommentTypeComment,
			Content:     comment.Content,
			CreatedUnix: timeutil.TimeStamp(comment.Created.Unix()),
			UpdatedUnix: timeutil.TimeStamp(comment.Updated.Unix()),
		}

		if err := g.remapUser(comment, &cm); err != nil {
			return err
		}

		// add reactions
		for _, reaction := range comment.Reactions {
			res := issues_model.Reaction{
				Type:        reaction.Content,
				CreatedUnix: timeutil.TimeStampNow(),
			}
			if err := g.remapUser(reaction, &res); err != nil {
				return err
			}
			cm.Reactions = append(cm.Reactions, &res)
		}

		cms = append(cms, &cm)
	}

	if len(cms) == 0 {
		return nil
	}
	return models.InsertIssueComments(cms)
}

// CreatePullRequests creates pull requests
func (g *GiteaLocalUploader) CreatePullRequests(prs ...*base.PullRequest) error {
	gprs := make([]*issues_model.PullRequest, 0, len(prs))
	for _, pr := range prs {
		gpr, err := g.newPullRequest(pr)
		if err != nil {
			return err
		}

		if err := g.remapUser(pr, gpr.Issue); err != nil {
			return err
		}

		gprs = append(gprs, gpr)
	}
	if err := models.InsertPullRequests(gprs...); err != nil {
		return err
	}
	for _, pr := range gprs {
		g.issues[pr.Issue.Index] = pr.Issue
		pull.AddToTaskQueue(pr)
	}
	return nil
}

func (g *GiteaLocalUploader) updateGitForPullRequest(pr *base.PullRequest) (head string, err error) {
	// SECURITY: this pr must have been ensured safe
	if !pr.EnsuredSafe {
		log.Error("PR #%d in %s/%s has not been checked for safety.", pr.Number, g.repoOwner, g.repoName)
		return "", fmt.Errorf("the PR[%d] was not checked for safety", pr.Number)
	}

	// Anonymous function to download the patch file (allows us to use defer)
	err = func() error {
		// if the patchURL is empty there is nothing to download
		if pr.PatchURL == "" {
			return nil
		}

		// SECURITY: We will assume that the pr.PatchURL has been checked
		// pr.PatchURL may be a local file - but note EnsureSafe should be asserting that this is safe
		ret, err := uri.Open(pr.PatchURL) // TODO: This probably needs to use the downloader as there may be rate limiting issues here
		if err != nil {
			return err
		}
		defer ret.Close()

		pullDir := filepath.Join(g.repo.RepoPath(), "pulls")
		if err = os.MkdirAll(pullDir, os.ModePerm); err != nil {
			return err
		}

		f, err := os.Create(filepath.Join(pullDir, fmt.Sprintf("%d.patch", pr.Number)))
		if err != nil {
			return err
		}
		defer f.Close()

		// TODO: Should there be limits on the size of this file?
		_, err = io.Copy(f, ret)

		return err
	}()
	if err != nil {
		return "", err
	}

	head = "unknown repository"
	if pr.IsForkPullRequest() && pr.State != "closed" {
		// OK we want to fetch the current head as a branch from its CloneURL
		// 1. Is there a head clone URL available?
		// 2. Is there a head ref available?
		if pr.Head.CloneURL == "" || pr.Head.Ref == "" {
			return head, nil
		}

		// 3. We need to create a remote for this clone url
		// ... maybe we already have a name for this remote
		remote, ok := g.prHeadCache[pr.Head.CloneURL+":"]
		if !ok {
			// ... let's try ownername as a reasonable name
			remote = pr.Head.OwnerName
			if !git.IsValidRefPattern(remote) {
				// ... let's try something less nice
				remote = "head-pr-" + strconv.FormatInt(pr.Number, 10)
			}
			// ... now add the remote
			err := g.gitRepo.AddRemote(remote, pr.Head.CloneURL, true)
			if err != nil {
				log.Error("PR #%d in %s/%s AddRemote[%s] failed: %v", pr.Number, g.repoOwner, g.repoName, remote, err)
			} else {
				g.prHeadCache[pr.Head.CloneURL+":"] = remote
				ok = true
			}
		}
		if !ok {
			return head, nil
		}

		// 4. Check if we already have this ref?
		localRef, ok := g.prHeadCache[pr.Head.CloneURL+":"+pr.Head.Ref]
		if !ok {
			// ... We would normally name this migrated branch as <OwnerName>/<HeadRef> but we need to ensure that is safe
			localRef = git.SanitizeRefPattern(pr.Head.OwnerName + "/" + pr.Head.Ref)

			// ... Now we must assert that this does not exist
			if g.gitRepo.IsBranchExist(localRef) {
				localRef = "head-pr-" + strconv.FormatInt(pr.Number, 10) + "/" + localRef
				i := 0
				for g.gitRepo.IsBranchExist(localRef) {
					if i > 5 {
						// ... We tried, we really tried but this is just a seriously unfriendly repo
						return head, nil
					}
					// OK just try some uuids!
					localRef = git.SanitizeRefPattern("head-pr-" + strconv.FormatInt(pr.Number, 10) + uuid.New().String())
					i++
				}
			}

			fetchArg := pr.Head.Ref + ":" + git.BranchPrefix + localRef
			if strings.HasPrefix(fetchArg, "-") {
				fetchArg = git.BranchPrefix + fetchArg
			}

			_, _, err = git.NewCommand(g.ctx, "fetch", "--no-tags", "--", remote, fetchArg).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
			if err != nil {
				log.Error("Fetch branch from %s failed: %v", pr.Head.CloneURL, err)
				return head, nil
			}
			g.prHeadCache[pr.Head.CloneURL+":"+pr.Head.Ref] = localRef
			head = localRef
		}

		// 5. Now if pr.Head.SHA == "" we should recover this to the head of this branch
		if pr.Head.SHA == "" {
			headSha, err := g.gitRepo.GetBranchCommitID(localRef)
			if err != nil {
				log.Error("unable to get head SHA of local head for PR #%d from %s in %s/%s. Error: %v", pr.Number, pr.Head.Ref, g.repoOwner, g.repoName, err)
				return head, nil
			}
			pr.Head.SHA = headSha
		}

		_, _, err = git.NewCommand(g.ctx, "update-ref", "--no-deref", pr.GetGitRefName(), pr.Head.SHA).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
		if err != nil {
			return "", err
		}

		return head, nil
	}

	if pr.Head.Ref != "" {
		head = pr.Head.Ref
	}

	// Ensure the closed PR SHA still points to an existing ref
	if pr.Head.SHA == "" {
		// The SHA is empty
		log.Warn("Empty reference, no pull head for PR #%d in %s/%s", pr.Number, g.repoOwner, g.repoName)
	} else {
		_, _, err = git.NewCommand(g.ctx, "rev-list", "--quiet", "-1", pr.Head.SHA).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
		if err != nil {
			// Git update-ref removes bad references with a relative path
			log.Warn("Deprecated local head %s for PR #%d in %s/%s, removing %s", pr.Head.SHA, pr.Number, g.repoOwner, g.repoName, pr.GetGitRefName())
		} else {
			// set head information
			_, _, err = git.NewCommand(g.ctx, "update-ref", "--no-deref", pr.GetGitRefName(), pr.Head.SHA).RunStdString(&git.RunOpts{Dir: g.repo.RepoPath()})
			if err != nil {
				log.Error("unable to set %s as the local head for PR #%d from %s in %s/%s. Error: %v", pr.Head.SHA, pr.Number, pr.Head.Ref, g.repoOwner, g.repoName, err)
			}
		}
	}

	return head, nil
}

func (g *GiteaLocalUploader) newPullRequest(pr *base.PullRequest) (*issues_model.PullRequest, error) {
	var labels []*issues_model.Label
	for _, label := range pr.Labels {
		lb, ok := g.labels[label.Name]
		if ok {
			labels = append(labels, lb)
		}
	}

	milestoneID := g.milestones[pr.Milestone]

	head, err := g.updateGitForPullRequest(pr)
	if err != nil {
		return nil, fmt.Errorf("updateGitForPullRequest: %w", err)
	}

	// Now we may need to fix the mergebase
	if pr.Base.SHA == "" {
		if pr.Base.Ref != "" && pr.Head.SHA != "" {
			// A PR against a tag base does not make sense - therefore pr.Base.Ref must be a branch
			// TODO: should we be checking for the refs/heads/ prefix on the pr.Base.Ref? (i.e. are these actually branches or refs)
			pr.Base.SHA, _, err = g.gitRepo.GetMergeBase("", git.BranchPrefix+pr.Base.Ref, pr.Head.SHA)
			if err != nil {
				log.Error("Cannot determine the merge base for PR #%d in %s/%s. Error: %v", pr.Number, g.repoOwner, g.repoName, err)
			}
		} else {
			log.Error("Cannot determine the merge base for PR #%d in %s/%s. Not enough information", pr.Number, g.repoOwner, g.repoName)
		}
	}

	if pr.Created.IsZero() {
		if pr.Closed != nil {
			pr.Created = *pr.Closed
		} else if pr.MergedTime != nil {
			pr.Created = *pr.MergedTime
		} else {
			pr.Created = time.Now()
		}
	}
	if pr.Updated.IsZero() {
		pr.Updated = pr.Created
	}

	issue := issues_model.Issue{
		RepoID:      g.repo.ID,
		Repo:        g.repo,
		Title:       pr.Title,
		Index:       pr.Number,
		Content:     pr.Content,
		MilestoneID: milestoneID,
		IsPull:      true,
		IsClosed:    pr.State == "closed",
		IsLocked:    pr.IsLocked,
		Labels:      labels,
		CreatedUnix: timeutil.TimeStamp(pr.Created.Unix()),
		UpdatedUnix: timeutil.TimeStamp(pr.Updated.Unix()),
	}

	if err := g.remapUser(pr, &issue); err != nil {
		return nil, err
	}

	// add reactions
	for _, reaction := range pr.Reactions {
		res := issues_model.Reaction{
			Type:        reaction.Content,
			CreatedUnix: timeutil.TimeStampNow(),
		}
		if err := g.remapUser(reaction, &res); err != nil {
			return nil, err
		}
		issue.Reactions = append(issue.Reactions, &res)
	}

	pullRequest := issues_model.PullRequest{
		HeadRepoID: g.repo.ID,
		HeadBranch: head,
		BaseRepoID: g.repo.ID,
		BaseBranch: pr.Base.Ref,
		MergeBase:  pr.Base.SHA,
		Index:      pr.Number,
		HasMerged:  pr.Merged,

		Issue: &issue,
	}

	if pullRequest.Issue.IsClosed && pr.Closed != nil {
		pullRequest.Issue.ClosedUnix = timeutil.TimeStamp(pr.Closed.Unix())
	}
	if pullRequest.HasMerged && pr.MergedTime != nil {
		pullRequest.MergedUnix = timeutil.TimeStamp(pr.MergedTime.Unix())
		pullRequest.MergedCommitID = pr.MergeCommitSHA
		pullRequest.MergerID = g.doer.ID
	}

	// TODO: assignees

	return &pullRequest, nil
}

func convertReviewState(state string) issues_model.ReviewType {
	switch state {
	case base.ReviewStatePending:
		return issues_model.ReviewTypePending
	case base.ReviewStateApproved:
		return issues_model.ReviewTypeApprove
	case base.ReviewStateChangesRequested:
		return issues_model.ReviewTypeReject
	case base.ReviewStateCommented:
		return issues_model.ReviewTypeComment
	case base.ReviewStateRequestReview:
		return issues_model.ReviewTypeRequest
	default:
		return issues_model.ReviewTypePending
	}
}

// CreateReviews creates pull request reviews of currently migrated issues
func (g *GiteaLocalUploader) CreateReviews(reviews ...*base.Review) error {
	cms := make([]*issues_model.Review, 0, len(reviews))
	for _, review := range reviews {
		var issue *issues_model.Issue
		issue, ok := g.issues[review.IssueIndex]
		if !ok {
			return fmt.Errorf("review references non existent IssueIndex %d", review.IssueIndex)
		}
		if review.CreatedAt.IsZero() {
			review.CreatedAt = time.Unix(int64(issue.CreatedUnix), 0)
		}

		cm := issues_model.Review{
			Type:        convertReviewState(review.State),
			IssueID:     issue.ID,
			Content:     review.Content,
			Official:    review.Official,
			CreatedUnix: timeutil.TimeStamp(review.CreatedAt.Unix()),
			UpdatedUnix: timeutil.TimeStamp(review.CreatedAt.Unix()),
		}

		if err := g.remapUser(review, &cm); err != nil {
			return err
		}

		cms = append(cms, &cm)

		// get pr
		pr, ok := g.prCache[issue.ID]
		if !ok {
			var err error
			pr, err = issues_model.GetPullRequestByIssueIDWithNoAttributes(issue.ID)
			if err != nil {
				return err
			}
			g.prCache[issue.ID] = pr
		}
		if pr.MergeBase == "" {
			// No mergebase -> no basis for any patches
			log.Warn("PR #%d in %s/%s: does not have a merge base, all review comments will be ignored", pr.Index, g.repoOwner, g.repoName)
			continue
		}

		headCommitID, err := g.gitRepo.GetRefCommitID(pr.GetGitRefName())
		if err != nil {
			log.Warn("PR #%d GetRefCommitID[%s] in %s/%s: %v, all review comments will be ignored", pr.Index, pr.GetGitRefName(), g.repoOwner, g.repoName, err)
			continue
		}

		for _, comment := range review.Comments {
			line := comment.Line
			if line != 0 {
				comment.Position = 1
			} else {
				_, _, line, _ = git.ParseDiffHunkString(comment.DiffHunk)
			}

			// SECURITY: The TreePath must be cleaned!
			comment.TreePath = path.Clean("/" + comment.TreePath)[1:]

			var patch string
			reader, writer := io.Pipe()
			defer func() {
				_ = reader.Close()
				_ = writer.Close()
			}()
			go func(comment *base.ReviewComment) {
				if err := git.GetRepoRawDiffForFile(g.gitRepo, pr.MergeBase, headCommitID, git.RawDiffNormal, comment.TreePath, writer); err != nil {
					// We should ignore the error since the commit may be removed when force pushing to the pull request
					log.Warn("GetRepoRawDiffForFile failed when migrating [%s, %s, %s, %s]: %v", g.gitRepo.Path, pr.MergeBase, headCommitID, comment.TreePath, err)
				}
				_ = writer.Close()
			}(comment)

			patch, _ = git.CutDiffAroundLine(reader, int64((&issues_model.Comment{Line: int64(line + comment.Position - 1)}).UnsignedLine()), line < 0, setting.UI.CodeCommentLines)

			if comment.CreatedAt.IsZero() {
				comment.CreatedAt = review.CreatedAt
			}
			if comment.UpdatedAt.IsZero() {
				comment.UpdatedAt = comment.CreatedAt
			}

			if !git.IsValidSHAPattern(comment.CommitID) {
				log.Warn("Invalid comment CommitID[%s] in PR #%d of %s/%s replaced with %s", comment.CommitID, pr.Index, g.repoOwner, g.repoName, headCommitID)
				comment.CommitID = headCommitID
			}

			c := issues_model.Comment{
				Type:        issues_model.CommentTypeCode,
				IssueID:     issue.ID,
				Content:     comment.Content,
				Line:        int64(line + comment.Position - 1),
				TreePath:    comment.TreePath,
				CommitSHA:   comment.CommitID,
				Patch:       patch,
				CreatedUnix: timeutil.TimeStamp(comment.CreatedAt.Unix()),
				UpdatedUnix: timeutil.TimeStamp(comment.UpdatedAt.Unix()),
			}

			if err := g.remapUser(review, &c); err != nil {
				return err
			}

			cm.Comments = append(cm.Comments, &c)
		}
	}

	return issues_model.InsertReviews(cms)
}

// Rollback rolls back all changes when the migration has failed.
func (g *GiteaLocalUploader) Rollback() error {
	if g.repo != nil && g.repo.ID > 0 {
		g.gitRepo.Close()
		if err := models.DeleteRepository(g.doer, g.repo.OwnerID, g.repo.ID); err != nil {
			return err
		}
	}
	return nil
}

// Finish updates the repository status once the migration has succeeded.
func (g *GiteaLocalUploader) Finish() error {
	if g.repo == nil || g.repo.ID <= 0 {
		return ErrRepoNotCreated
	}

	// update issue_index
	if err := issues_model.RecalculateIssueIndexForRepo(g.repo.ID); err != nil {
		return err
	}

	if err := models.UpdateRepoStats(g.ctx, g.repo.ID); err != nil {
		return err
	}

	g.repo.Status = repo_model.RepositoryReady
	return repo_model.UpdateRepositoryCols(g.ctx, g.repo, "status")
}

func (g *GiteaLocalUploader) remapUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) error {
	var userid int64
	var err error
	if g.sameApp {
		userid, err = g.remapLocalUser(source, target)
	} else {
		userid, err = g.remapExternalUser(source, target)
	}
	if err != nil {
		return err
	}

	if userid > 0 {
		return target.RemapExternalUser("", 0, userid)
	}
	return target.RemapExternalUser(source.GetExternalName(), source.GetExternalID(), g.doer.ID)
}

func (g *GiteaLocalUploader) remapLocalUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) (int64, error) {
	userid, ok := g.userMap[source.GetExternalID()]
	if !ok {
		name, err := user_model.GetUserNameByID(g.ctx, source.GetExternalID())
		if err != nil {
			return 0, err
		}
		// let's not reuse an ID when the user was deleted or has a different user name
		if name != source.GetExternalName() {
			userid = 0
		} else {
			userid = source.GetExternalID()
		}
		g.userMap[source.GetExternalID()] = userid
	}
	return userid, nil
}

func (g *GiteaLocalUploader) remapExternalUser(source user_model.ExternalUserMigrated, target user_model.ExternalUserRemappable) (userid int64, err error) {
	userid, ok := g.userMap[source.GetExternalID()]
	if !ok {
		userid, err = user_model.GetUserIDByExternalUserID(g.gitServiceType.Name(), fmt.Sprintf("%d", source.GetExternalID()))
		if err != nil {
			log.Error("GetUserIDByExternalUserID: %v", err)
			return 0, err
		}
		g.userMap[source.GetExternalID()] = userid
	}
	return userid, nil
}
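
// Editorial note, not part of the upstream file: remapUser is the single
// entry point used by the Create* methods above. A minimal sketch of the
// calling pattern, mirroring the reaction handling in CreateIssues:
//
//	res := issues_model.Reaction{Type: reaction.Content, CreatedUnix: timeutil.TimeStampNow()}
//	if err := g.remapUser(reaction, &res); err != nil {
//		return err
//	}
//	// res now carries either the matching local user ID, or the original
//	// external name/ID with g.doer as the fallback poster.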