22

I have a fairly standard CSV file with headers. I want to add a new column and set all the rows to the same data.

Original:

column1, column2
1,b
2,c
3,5

After

column1, column2, column3
1,b, setvalue
2,c, setvalue
3,5, setvalue

I can't find anything on this. If anybody could point me in the right direction, that would be great. Sorry, I'm very new to PowerShell.

1
  • Is there a code piece that generates the column1 (serial numbers) programmatically? Commented Jun 12, 2017 at 11:04

6 Answers

62

Here's one way to do that using Calculated Properties:

Import-Csv file.csv | 
Select-Object *,@{Name='column3';Expression={'setvalue'}} | 
Export-Csv file.csv -NoTypeInformation

You can find more on calculated properties here: http://technet.microsoft.com/en-us/library/ff730948.aspx.

In a nutshell: you import the file, pipe the content to the Select-Object cmdlet, select all existing properties (i.e. *), then add a new one.


4 Comments

Thanks that's spot on. Really appreciate it.
Hi, sorry for reviving this old post, but I get an empty file when exporting to the same file name. It works fine when I export to a file with a different name than the original. Is there any reason why this happens?
This seems to be related to the same file being used for reading and writing in Shay Levy's answer. Just wrap the first line in parentheses and you're good to go! =)
Wrapping the first line in parentheses did not work for me.
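A minimal self-contained sketch of the parenthesis fix discussed in the comments (the sample data mirrors the question). Without the parentheses, Export-Csv opens and truncates file.csv before Import-Csv has finished streaming rows out of it; the parentheses force the whole file to be read into memory first:

```powershell
# Build a sample CSV matching the question.
@'
column1,column2
1,b
2,c
3,5
'@ | Set-Content file.csv

# Parentheses make Import-Csv complete before Export-Csv
# truncates the same file for writing.
(Import-Csv file.csv) |
    Select-Object *, @{Name = 'column3'; Expression = { 'setvalue' }} |
    Export-Csv file.csv -NoTypeInformation
```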
14

Shay Levy's answer also works for me!

If you don't need to provide a value for each object yet, the code is even easier...

Import-Csv file.csv | 
Select-Object *,"column3" | 
Export-Csv file.csv -NoTypeInformation

1 Comment

Hi, sorry for reviving this old post, but I get an empty file when exporting to the same file name. It works fine when I export to a file with a different name than the original. Is there any reason why this happens?
5

You could also use Add-Member:

$csv = Import-Csv 'input.csv'

foreach ($row in $csv)
{
    $row | Add-Member -NotePropertyName 'MyNewColumn' -NotePropertyValue 'MyNewValue'
}

$csv | Export-Csv 'output.csv' -NoTypeInformation


3

None of the scripts I've seen are dynamic in nature, so they're fairly limited in scope and in what you can do with them. That's probably because most PS users, and even power users, aren't programmers; you very rarely see arrays used in PowerShell. I took Shay Levy's answer and improved upon it.

Note here: the input needs to be consistent (two columns, for instance), but it would be fairly easy to modify this to count the columns dynamically and generate headers that way too. That wasn't asked in this particular question. Or simply don't generate a header unless it's needed.

The script below pulls in as many CSV files as exist in the folder, adds a header, and later strips it. I add the header for consistency in the data, and it makes manipulating the columns later down the line fairly straightforward too (if you choose to do so). You can modify this to your heart's content, and feel free to use it for other purposes. This is generally the format I stick with for just about any of my PowerShell needs. The counter lets you manipulate individual files, so there are a lot of possibilities here.

$chargeFiles = 'C:\YOURFOLDER\BLAHBLAH\'
$existingReturns = Get-ChildItem $chargeFiles

for ($i = 0; $i -lt $existingReturns.Count; $i++)
{
    $csv = Import-Csv -Path $existingReturns[$i].FullName -Header Header1,Header2
    $csv |
        Select-Object *, @{Name='Header3'; Expression={'Header3 Static'}} |
        Select-Object *, @{Name='Header4'; Expression={'Header4 Static Text'}} |
        Select-Object *, @{Name='Header5'; Expression={'Header5 Static Text'}} |
        ConvertTo-Csv -Delimiter "," -NoTypeInformation |
        Select-Object -Skip 1 |
        ForEach-Object { $_ -replace '"', '' } |
        Out-File -FilePath $existingReturns[$i].FullName -Force -Encoding ASCII
}
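The dynamic header generation mentioned in the note above might be sketched like this. The file name and the comma delimiter are assumptions for illustration, not part of the original answer:

```powershell
# Sample headerless input with an arbitrary number of columns
# ('input.csv' is a placeholder name for this sketch).
@'
1,b,x
2,c,y
'@ | Set-Content input.csv

# Count the columns in the first line, then generate Header1..HeaderN.
$firstLine   = Get-Content input.csv -TotalCount 1
$columnCount = ($firstLine -split ',').Count
$headers     = 1..$columnCount | ForEach-Object { "Header$_" }

$csv = Import-Csv -Path input.csv -Header $headers
```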


0

For some applications, I found that producing a hashtable and using the .values as the column to be good (it would allow for cross reference validation against another object that was being enumerated).

In this case, #powershell on freenode brought my attention to an ordered hashtable (since the column header must be used).

Here is an example without any validation of the .values:

$newcolumnobj =  [ordered]@{}
#input data into a hash table so that we can more easily reference the `.values` as an object to be inserted in the CSV
$newcolumnobj.add("volume name", $currenttime)

#enumerate $deltas (this will be the object that contains the volume information, `$volumedeltas`)
#  add just the new deltas to the newcolumn object
foreach ($item in $deltas){ 
  $newcolumnobj.add($item.volume,$item.delta)
}

$originalcsv = @(import-csv $targetdeltacsv)

#thanks to pscookiemonster in #powershell on freenode
for($i=0; $i -lt $originalcsv.count; $i++){
  $originalcsv[$i] | Select-Object *, @{l="$currenttime"; e={$newcolumnobj.item($i)}}
}
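The loop above writes the augmented rows to the pipeline but doesn't persist them. A minimal self-contained sketch of capturing and exporting them; the sample data and the 'output.csv' name are placeholders, not part of the original answer:

```powershell
# Placeholder stand-ins for the variables used in the answer.
$currenttime  = Get-Date -Format 'HH:mm:ss'
$newcolumnobj = [ordered]@{ a = 10; b = 20 }
$originalcsv  = @(
    [pscustomobject]@{ volume = 'a' },
    [pscustomobject]@{ volume = 'b' }
)

# Capture the loop's output, then export it.
$augmented = for ($i = 0; $i -lt $originalcsv.Count; $i++) {
    $originalcsv[$i] |
        Select-Object *, @{ l = "$currenttime"; e = { $newcolumnobj[$i] } }
}
$augmented | Export-Csv 'output.csv' -NoTypeInformation
```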

Example is related to How can I perform arithmetic to find differences of values in two CSVs?


-3

Define the CSV file's path; here $PSScriptRoot is the folder the script runs from:

$csv = "$PSScriptRoot/dpg.csv"

Create a CSV file with nothing in it:

$null >> $csv

Now add columns to it:

$csv | select vds, protgroup, vlan, ports | Export-Csv $csv

