Rattandeep Singh

Lazy Administrator: Split the list.



Hello all, I am back with a new trick I use whenever I am not in the mood to create .csv files just to test a script against some data, primarily when working with user, group or computer objects in Active Directory.


Many times I have needed to make bulk changes to security groups or user accounts, such as changing the description.


As a smart administrator, would you test your PowerShell script with a few groups or accounts first?

I definitely test, to be sure I do not make mistakes. But at the same time my inner lazy admin wakes up and does not want to create a new .csv file with test objects and then point the script at the new file path.


So, I found a way to paste the list straight into the script and process it to get the desired output.


Close your mouth, it is possible. You can take the data as plain text and split it into multiple values.

I will now store four security groups in a variable named $data, exactly as they would look if copied from a .csv file opened in Excel.

PS C:\Users\rsingh> $data = "TestGroup1
>> testgroup2
>> testgroup3
>> testgroup4"

Now, if I display the variable in the console, the output looks fine; all the groups are listed.

PS C:\Users\rsingh> $data
TestGroup1
testgroup2
testgroup3
testgroup4
So what is wrong with the above output?

Assuming the above output is perfect, I want to print the first value, "TestGroup1".

I will use an index to do so. The output is not what I expected: it is only a single letter of the first group name. Let's fix this.

PS C:\Users\rsingh> $data[0]
T
PS C:\Users\rsingh> $data[1]
e
PS C:\Users\rsingh> $data[3]
t
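The reason is that $data is still one single string, and indexing a string returns individual characters rather than lines. A quick check in the console confirms this (a sketch; this is the behaviour I would expect on PowerShell 3.0 or later):

PS C:\Users\rsingh> $data.GetType().Name
String
PS C:\Users\rsingh> $data.Count
1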

To get each group separately, I will process $data by splitting the text on the newline character and trimming each piece to remove any stray whitespace.

PS C:\Users\rsingh> $data = $data.Split("`n").Trim()

After processing $data with Split() and Trim(), indexing now returns the full group name.

PS C:\Users\rsingh> $data[0]
TestGroup1
PS C:\Users\rsingh> $data[1]
testgroup2
PS C:\Users\rsingh> $data[2]
testgroup3
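If the list was pasted from a Windows file, each line may also carry a carriage return; Trim() already removes it here, but the -split operator with a small regex is another way to do the same thing in one step. A sketch, starting again from the original multi-line string:

## -split treats the pattern as a regex, so `r?`n matches both Windows and Unix line endings
PS C:\Users\rsingh> $data = $data -split "`r?`n" | ForEach-Object { $_.Trim() }
PS C:\Users\rsingh> $data[0]
TestGroup1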
Now, a real world example.

Below is a script that retrieves the details of the security groups, with the list of names stored directly in a variable.




## Data stored in a simple text format, each value on a new line.
$data = "TestGroup1
testgroup2
testgroup3
testgroup4"

## Processing the data using the Split() and Trim() methods
$data = $data.Split("`n").Trim()

## New variable to keep track of the count of iterations in the loop
$i = 0

## New variable, an empty array, to collect the results

$output = @()

## Starting the foreach loop to process the records from the list stored in $data
foreach($record in $data)
{
    ## Increasing the count
    $i++

    ## Screen message with the count and the record being processed
    Write-Host "Working on record number:: $($i) and the value being processed is :: $($record)"

    ## Storing the output in the array
    $output += Get-ADGroup -Identity $record | select SamAccountName,DistinguishedName
}
## Displaying the output on the console screen
$output

Working on record number:: 1 and the value being processed is :: TestGroup1
Working on record number:: 2 and the value being processed is :: testgroup2
Working on record number:: 3 and the value being processed is :: testgroup3
Working on record number:: 4 and the value being processed is :: testgroup4

SamAccountName DistinguishedName
-------------- -----------------
TestGroup1     CN=TestGroup1,OU=TestOU,DC=designbots,DC=in
TestGroup2     CN=TestGroup2,OU=TestOU,DC=designbots,DC=in
TestGroup3     CN=TestGroup3,OU=TestOU,DC=designbots,DC=in
TestGroup4     CN=TestGroup4,OU=TestOU,DC=designbots,DC=in

The output looks clean. The script broke the list into individual names and retrieved the security group details for each.
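The same pattern works for the bulk changes that started all this; only the cmdlet inside the loop changes. A minimal sketch, assuming the ActiveDirectory module is available and using a made-up description value:

## Same idea: list pasted straight into the script, split, then changed in bulk
$data = "TestGroup1
testgroup2
testgroup3
testgroup4"

$data = $data.Split("`n").Trim()

foreach($record in $data)
{
    ## Hypothetical description text - replace with whatever the change request asks for
    Set-ADGroup -Identity $record -Description "Managed by IT"
}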

Note: I have used this script with up to 7,000 objects so far and have not faced any issues.
Try it yourself, and if you find it useful, share, like and comment.
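One thing worth knowing at that size: += on an array rebuilds the array on every iteration, so for several thousand objects I would collect the results in a generic list instead. A sketch of the same collection step, assuming PowerShell 5 or later; the behaviour is unchanged, it just avoids the repeated copies:

## Collect results in a List instead of growing an array with +=
$output = [System.Collections.Generic.List[object]]::new()

foreach($record in $data)
{
    $output.Add( (Get-ADGroup -Identity $record | select SamAccountName,DistinguishedName) )
}

$output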


