Jamal

Quickly check whether file name is in column in an SQL table and move to new folder

I have a table where one of the columns contains a list of file names. The names come from other tables and correspond to files in a folder.

Not all the files are used; some may be attached to parcels that are no longer active. My list contains only the active parcels.

After creating the table, I want to go through the folder of files and move the files that are being used (i.e., exist in the table) to another folder.

Currently I iterate over the files in the folder and, one file at a time, test whether it returns a record using an SQL command.

This works, but the constant round trips to the database are slow.

Here is the code as it sits now:

param(
    [string] $src,
    [string] $dest,
    [string] $table
)

$sql_instance_name = 'db'
$db_name = 'DB2'
$sql_user = 'user'
$sql_user_pswd = 'password'

Get-ChildItem -Path $src -Recurse -File | ForEach-Object {
    $query = "SELECT * FROM $table WHERE FILE_NAME LIKE '$($_.Name)'"
    #Write-Output $query

    $expcsv = Invoke-Sqlcmd -Username $sql_user -Password $sql_user_pswd -Database $db_name -Query $query -ServerInstance $sql_instance_name
    if ($null -ne $expcsv)
    {
        $nextName = Join-Path -Path $dest $_.Name
        Write-Output $nextName
        $_ | Move-Item -Destination $nextName
    }
}

Is there a way to load the data into memory and search that instead of going back and forth to the database? I have to assume it would be quicker.
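One way to sketch this: pull the `FILE_NAME` column in a single query, load it into a case-insensitive `HashSet[string]`, and test each file locally. This is only an illustration under the question's own assumptions — the connection values are the same placeholders as above, and it assumes `FILE_NAME` stores the name with its extension:

```powershell
# One round trip: pull every used file name into memory.
# Connection values are the same placeholders used in the question.
$rows = Invoke-Sqlcmd -ServerInstance $sql_instance_name -Database $db_name `
    -Username $sql_user -Password $sql_user_pswd `
    -Query "SELECT FILE_NAME FROM $table"

# A case-insensitive hash set gives constant-time lookups per file,
# instead of one database query per file.
$used = [System.Collections.Generic.HashSet[string]]::new(
    [string[]]($rows | Select-Object -ExpandProperty FILE_NAME),
    [System.StringComparer]::OrdinalIgnoreCase)

Get-ChildItem -Path $src -Recurse -File | ForEach-Object {
    if ($used.Contains($_.Name)) {
        Move-Item -LiteralPath $_.FullName -Destination (Join-Path $dest $_.Name)
    }
}
```

The hash set matters here: `-contains` on a plain array scans the whole list for every file, while `HashSet.Contains` stays fast even with tens of thousands of names.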
