I would like to use KeePass (Win10 x64).
The problem is that, over the years, I have collected my passwords like this:
every username/password pair is in a separate text file,
the file name is the domain the password belongs to,
and in each .txt file the first line is the username and the second line the password (no prefix on either).
For example:
File name: reddit.com (it also has a .txt extension, which can be omitted)
Contents:
my_username
my_Password
To get the above into CSV, it has to be converted to:
"Account","Login Name","Password","Web Site", "Notes"
"Entry 1","my_username","my_password","http://reddit.com", ""
Note: if the password contains a comma (,), a double quote (") or a backslash (\), that character must be escaped with a backslash.
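Just as a reference for the escaping rule on its own, a sed pipeline like the one below would do it (a sketch with a made-up sample password; the backslash has to be handled first so the escapes added for commas and quotes are not escaped again):
printf 'pas"wor,d\\end\n' | sed -e 's/\\/\\\\/g' -e 's/,/\\,/g' -e 's/"/\\"/g'
# -> pas\"wor\,d\\end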
How can this be done with a bash script (Cygwin)?
Snippets I have collected so far:
How to add a string at the beginning of each line of a file? (source)
awk '{print "prefix" $0}' file
Print the n-th line of file.txt (source)
awk "NR==2{print;exit}" file.txt // for printing the 2nd line
Run a command on every file found, recursively:
find . -name "*.txt" -execdir command {} \;
or, alternatively:
for file in /cygdrive/c/folder/*; do command; done   # command example: mv "$file" "${file}/.."
So my script should probably start like this:
find . -name "*.txt" -execdir awk '{print '"Account","Account","Login Name","Password","Web Site", "Notes"' $0}' "$file"
This is the point where I am stuck.
Answer 1
Input files:
- yahoo.net.txt
- gougle.com.txt
- reddit.com.txt
Contents of yahoo.net.txt:
user1-yaho0
pas,,
Contents of gougle.com.txt:
user1-google
pas"wor,d
Contents of reddit.com.txt:
user1-reddit
pas\wor\d
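For reference, the three test files can be recreated with printf, run in whatever folder the script below will scan (the contents are taken verbatim from the listing above):
printf 'user1-yaho0\npas,,\n' > yahoo.net.txt
printf 'user1-google\npas"wor,d\n' > gougle.com.txt
printf 'user1-reddit\npas\\wor\\d\n' > reddit.com.txt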
With this input, the following quick-and-dirty script does the job:
#!/bin/bash
# Write the CSV header, overwriting any previous output.csv
echo "\"Account\",\"Login Name\",\"Password\",\"Web Site\",\"Notes\"" > output.csv
num=0
for f in `find . -type f -name '*.txt'` ; do
    num=$((num + 1))
    user=`head -1 "$f"`                                    # line 1: login name
    password=`sed 1d "$f" | sed -r 's/([\\,"])/\\\\\1/g'`  # line 2: password, with \ , " escaped
    domain=`basename "$f" | sed 's/\.txt$//'`              # file name without the .txt extension
    echo "\"$num\",\"$user\",\"$password\",\"$domain\",\"\"" >> output.csv
done
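Saved as, say, convert.sh (the name is arbitrary), the script has to be run from the directory that holds the .txt files, since it uses find . :
cd /cygdrive/c/folder   # adjust to wherever the password files live
bash convert.sh         # writes output.csv into that directory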
Contents of output.csv:
"Account","Login Name","Password","Web Site","Notes"
"1","user1-reddit","pas\\wor\\d","reddit.com",""
"2","user1-yaho","pas\,\,","yahoo.net",""
"3","user1-google","pas\"wor\,d","gougle.com",""
Answer 2
With awk:
awk 'BEGIN{print "\"Account\",\"Login Name\",\"Password\",\"Web Site\",\"Notes\""}
     { if (FNR==1){
           user=$0; a+=1};
       if (FNR==2){
           print "\""a"\",\""user"\",\""$0"\",\""gensub(/\.txt/,"","g",FILENAME)"\",\"\""}}' *.txt
This uses the awk special variable FNR to pick the first and second line out of each *.txt file, and the string-manipulation function gensub() to strip the .txt from the Web Site column, which is taken from the awk special variable FILENAME.
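The one-liner above does not apply the backslash-escaping asked for in the question. A possible extension (my own sketch, not part of the original answer; it assumes GNU awk, which provides gensub()) runs a single gsub() over the password line before printing:
awk 'BEGIN{print "\"Account\",\"Login Name\",\"Password\",\"Web Site\",\"Notes\""}
     FNR==1{user=$0; a+=1}
     FNR==2{pw=$0
            gsub(/[\\,"]/,"\\\\&",pw)                # prefix \ , and " with a backslash
            site=gensub(/\.txt$/,"","g",FILENAME)    # Web Site column = file name minus .txt
            print "\""a"\",\""user"\",\""pw"\",\""site"\",\"\""}' *.txt
The Login Name column is left unescaped here, as in the original answer; the same gsub() could be applied to it if needed.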